Tutorials: How to prepare the demo scene

13,456 views

MetaHumanSDK

2 years ago

MetaHuman SDK is an automated AI solution to generate realistic animation for characters. This Unreal Engine plugin allows you to create and use lip sync animation generated by our cloud server.
In this tutorial, we describe in detail how to start using our plugin: adding TTS and a voice track, and merging everything into a combo request.
Test and tell us what you got in the comments.
Link to our discord: / discord
Link to our website: metahumansdk.io/
Get the plugin for free from Unreal Engine marketplace: www.unrealengine.com/marketpl...
#UnrealEngine #MetaHuman #MetaHumanSDK
MetaHumanSDK is a set of tools for creating an immersive interaction with a digital human. Our service creates facial animation from an input audio file or text. The plugin also includes connectivity modules for synthesized voice from Google or Azure (text to speech) and for building an interactive chat via Dialogflow (Google), enabling a live dialogue with a digital human.
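If you prefer to prepare a synthesized voice track outside the engine, a 16 kHz mono PCM WAV can be produced with Google's Cloud Text-to-Speech Python client. The sketch below is only an illustration under that assumption (package installed, credentials configured; the voice name and output path are chosen for the example) and is not part of the plugin, which drives TTS through its own nodes.

from google.cloud import texttospeech  # assumes the google-cloud-texttospeech package and credentials are set up

client = texttospeech.TextToSpeechClient()
response = client.synthesize_speech(
    input=texttospeech.SynthesisInput(text="Hello! I am a digital human."),
    voice=texttospeech.VoiceSelectionParams(language_code="en-US", name="en-US-Wavenet-D"),
    audio_config=texttospeech.AudioConfig(
        audio_encoding=texttospeech.AudioEncoding.LINEAR16,  # LINEAR16 output is a PCM WAV
        sample_rate_hertz=16000,                             # matches the audio parameters recommended in the comments below
    ),
)

with open("voice_track.wav", "wb") as f:  # hypothetical output path for the example
    f.write(response.audio_content)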
Link to our discord: / discord
Link to our website: metahumansdk.io/
Link to our plugin on the Unreal Engine marketplace: www.unrealengine.com/marketpl...
#UnrealEngine #MetaHuman #MetaHumanSDK #DigitalAvatar

Comments: 68
@AndreiGhenoiu 2 years ago
Thank you so much for this! For MetaHumans, in the Generate Lipsync Animation popup, once you select your Face_Archetype_Skeleton, set Mappings Mode to EMappingAsset and, under Mapping Assets, choose DigitalHuman_To_MetaHuman_Mappings. Hope this helps.
@veith3dclub 2 years ago
Thank you!!!!
@commanderskullySHepherdson 11 months ago
Is this in 4.27? I'm using 5.1, no luck.
@NeoxEntertainment 2 years ago
Hey, awesome SDK and video about it! I had some problems setting it up for a MetaHuman I created. I imported the audio and opened the lipsync plugin to select the skeleton; I used the one from the face of my MetaHuman, but it didn't work. Do you know what I did wrong?
@metahumansdk 2 years ago
Hey! Please stay tuned for our updates. We are going to record additional tutorials specifically for MetaHumans. You can follow the roadmap and bug fixes here: metahumansdk.canny.io/ In your case, could you send a screenshot of the "Generate Lipsync" window parameters?
@AlexandarKaravelov 2 years ago
Hi there! Aren't you used to Unreal Engine tutorials by now? :)) Here is what I did to remap the lip sync to a MetaHuman character:
1. You need the Face_Archetype_Skeleton from the face of your MetaHuman character.
2. Right-click the WAV audio file and create a lipsync animation.
3. Choose your WAV file (sound).
4. Set the skeleton to the Face_Archetype_Skeleton.
5. Set Mapping Mode to EMappingAsset.
6. Tick the "Set up for MetaHuman" box.
7. Under Mapping Asset you should have a DigitalHuman_to_Metahuman_Mapping asset; choose it!
8. Generate.
9. TADAAA! :) Now the animation works perfectly on the MetaHuman character's face!
@NeoxEntertainment 2 years ago
@AlexandarKaravelov Awesome, thank you :)
@donhsi 2 years ago
How does this compare to NVIDIA Audio2Face?
@niharika1003 1 year ago
Hello, currently trying out your plug-in. On Mac, once we click Generate Lipsync, Unreal crashes. On Windows, nothing is generated after clicking Generate. Please help! This is too cool to miss!
@metahumansdk 1 year ago
Hi Niharika! Can you share your log file with our support mail support@metahumansdk.io or on the Discord server discord.gg/MJmAaqtdN8?
@groovygrover6178 2 years ago
If you want to use a self-made character, how do you set up your character's skeleton to make it work with this plugin?
@metahumansdk 2 years ago
You can use any skeleton settings; the main point is to use the right rig for the character's face.
@georges8408 2 years ago
It is interesting!!! But the truth is that it is a bit outdated, since with an iPhone you can now sync both lips and facial motion. There is also a question here: if someone wants lip sync, they probably also want facial sync or animation. So although it is very interesting (and free), it is a bit useless for a more holistic facial mocap animation.
@metahumansdk 2 years ago
Thank you for your comment! Actually, we are solving different tasks... with our plugin you don't need to create or record animation at all; you just need audio, and you can create the animation at runtime. As for facial sync, it's included and we are improving it in each version :)
@NickAB94 2 years ago
The Live Link and other real-time solutions aren't quite good enough for me. I've been using NVIDIA's A2F as a layer on top of the face mocap (from Faceware, which has better lip sync even than the iPhone link), which for me has slightly improved the lip movement quality. I'm yet to try this one out.
@PerFeldvoss 2 years ago
Thanks, but where do I find instructions on how to get a "lipsync animation generated by our cloud server"? I assume you cannot use just any audio file... if so, the instructions are not clear?
@metahumansdk 2 years ago
You can find the full documentation here: arvrlab.notion.site/MetaHuman-SDK-Documentation-555624b940b943be8e0d7096cc0eb6ba
Recommended audio file parameters:
- Specification: PCM
- Format: WAV
- Bit depth: 16
- Sample rate: 16000 Hz
- Speaker channels: Mono
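Before sending audio to the plugin, it can help to verify the file against these parameters. The following is a minimal sketch, not part of the SDK, using only Python's standard wave module; the script name and file path are placeholders.

import sys
import wave

def check_sdk_audio(path):
    """Compare a WAV file against the recommended parameters: PCM WAV, 16-bit, 16000 Hz, mono."""
    problems = []
    with wave.open(path, "rb") as wav:  # raises wave.Error if the file is not a PCM WAV
        if wav.getnchannels() != 1:
            problems.append("channels = %d (expected mono)" % wav.getnchannels())
        if wav.getsampwidth() != 2:
            problems.append("bit depth = %d (expected 16)" % (wav.getsampwidth() * 8))
        if wav.getframerate() != 16000:
            problems.append("sample rate = %d Hz (expected 16000 Hz)" % wav.getframerate())
    return problems

if __name__ == "__main__":
    issues = check_sdk_audio(sys.argv[1])  # e.g. python check_audio.py my_voice_track.wav
    print("OK" if not issues else "\n".join(issues))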
@user-js7br9gq8u 2 years ago
Is there any related expression file that can be used together with this? In a simple voice dialogue the character's eyes look particularly dull, and it needs expressions beyond lip action, such as anger, happiness, etc., in order to produce expressions that match the real situation. Are there any relevant solutions?
@metahumansdk 2 years ago
Hey! If I got it right, you need different expressions for different situations. You can generate an appropriate animation file for different emotions. Right now only calm, happy, and angry are available, but more are coming; we are working on it. Stay tuned for roadmap updates: metahumansdk.canny.io/
@user-js7br9gq8u 2 years ago
@metahumansdk I don't seem to be able to map the generated animation to my MetaHuman's skeleton. I can generate files and see expression curves, such as eyes and mouth, but the character's face doesn't move in the animation. It seems that this has something to do with the naming.
@AnchorLee 2 years ago
I am really excited about this plugin... I can generate a face animation, but there is nothing in it... the lips don't move. Really frustrating...
@metahumansdk 2 years ago
Thank you for using our plugin. First, could you try the demo scene included in the plugin content, to get a better understanding of the pipeline?
@AnchorLee 2 years ago
@metahumansdk Thank you for the reply. Where can I get the demo scene? I've tried, but can't find it.
@user-jj8ld2dw7k 1 year ago
The following modules are missing or built with a different engine version: MetahumanSDK, MetahumanSDKEditor. Engine modules cannot be compiled at runtime. Please build through your IDE.
@metahumansdk 1 year ago
Hi 晓龙纹路! Perhaps you are trying to use the SDK on the wrong engine version. Can you share more details about the SDK version and engine version? You can also ask questions in our Discord, with logs and screenshots.
@user-jn4gs3uw6i 2 years ago
Hello, why is the DigitalHumanAnimation Content not displayed after I open the plugin and set the project settings? I checked the downloaded version; there should be no download error.
@metahumansdk 2 years ago
Hello! Don't forget to enable the "Show Plugin Content" setting in UE.
@user-jn4gs3uw6i 2 years ago
@metahumansdk Thanks! The problem is solved.
@mistert2962 2 years ago
How long can audio files be? I tested a 15-minute audio file but it is not working. A 2-minute audio file works fine. I am trying 3 minutes right now.
@metahumansdk 2 years ago
Actually, it doesn't matter, but could you clarify what kind of audio file you use? Please send the detailed settings.
@mistert2962 2 years ago
@metahumansdk I wanted to create a 15-minute animation for my 15-minute learning topic. But now MetaHumanSDK seems to be gone; I cannot connect to your server. Is it because of this whole war thing?
@georgeluna6217 2 years ago
Any tutorial on how to set it up with a MetaHuman?
@metahumansdk 2 years ago
Hey! Please stay tuned for our updates. We are going to record additional tutorials specifically for MetaHumans. You can follow the roadmap and bug fixes here: metahumansdk.canny.io/
@JCPhotographyMallorca 2 years ago
Nothing happens when I try to generate the lipsync.
@metahumansdk 2 years ago
Hey! Please try it on the default scene and follow the tutorial. Do you have any questions about that?
@JCPhotographyMallorca 2 years ago
@metahumansdk Got it to work, sorry, it just took a little searching! Now I'm just trying to figure out how to get it to work on a MetaHuman rather than the sample project.
@jonlon4406 1 year ago
The Audio To Lipsync node is too slow at runtime; is there a way to go faster?
@metahumansdk 1 year ago
The animation delay depends on the length of the audio; the average time for ATL generation is about 70% of the audio length. If you use ATL Stream, in the best cases we get a response with the first chunk in about 3 seconds.
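As a rough worked example using those averages (figures quoted in the reply above, not guarantees): a 60-second clip would take about 42 seconds as a single ATL request, while ATL Stream could start returning animation after roughly 3 seconds. A tiny Python sketch of the same arithmetic:

def atl_latency_estimate(audio_seconds):
    """Back-of-the-envelope estimate from the averages quoted above (not SDK guarantees)."""
    return {
        "full_request_s": 0.70 * audio_seconds,  # ATL generation takes ~70% of the audio length
        "stream_first_chunk_s": 3.0,             # ATL Stream: first chunk in ~3 s in the best case
    }

print(atl_latency_estimate(60))  # ~42 s for a full request vs ~3 s until streamed playback can begin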
@Amelia_PC 2 years ago
Hey! Seems impressive! But it just doesn't work. I've followed all the steps and still there's no "DigitalHumanAnimation Content". Tried a clean project, versions 4.26 and 4.27. Nothing. "Show Plugin Content" is on. Without the sample, I can't do anything.
@qntmstudios1214 2 years ago
Make sure "Show Engine Content" is on as well.
@Amelia_PC 2 years ago
@qntmstudios1214 Yup, "Show Engine Content" was on too. I've tried everything and no content showed up. (Tried versions 4.26 and 4.27 with different projects: a MetaHuman project and a clean project. I'm using Windows 10 x64.) Thanks for answering.
@metahumansdk 2 years ago
We have added that problem for deeper testing and we'll check it again for all versions. Thank you for your comments.
@Amelia_PC 2 years ago
@metahumansdk Thanks :)
@qumetademo 2 years ago
How many languages does it support?
@metahumansdk 2 years ago
It's a language-agnostic plugin, but it has been tested the most in Russian. Stay tuned for more updates.
@aitrends8901 2 years ago
No intention to criticize this work, but I would expect better fidelity...
@user-jj8ld2dw7k 1 year ago
Why is my voice out of sync with the mouth?
@metahumansdk 1 year ago
We need more details about your use case, because there are too many scenarios in the engine and all of them need their own settings. If possible, share more details in our Discord.
@hernanmartz 2 years ago
Will it be compatible with UE5?
@metahumansdk 2 years ago
Dear Hernán Martz, yes, we are working on a plugin update and expect it in June.
@hernanmartz 2 years ago
@metahumansdk Awesome!! Thanks. 😃
@hernanmartz 2 years ago
@metahumansdk Still waiting for it. 😢
@sarbatore 2 years ago
Where is our BP_FaceExample?
@mischaschaub5864 2 years ago
Cannot find it either. Please help!
@metahumansdk 2 years ago
Hey! You can find it in the "DigitalHumanAnimation Content" folder. Don't forget to enable "Show Plugin Content" in UE.
@luiginicastro1101 2 years ago
When I right-click to generate lip sync, I don't see the option to generate lipsync animations. Any suggestions?
@luiginicastro1101 2 years ago
Oh, my mistake, I didn't notice my audio file wasn't .wav.
@sadevanka8542 2 years ago
Unreal Engine version?
@metahumansdk 2 years ago
4.26 or 4.27
@sadevanka8542 2 years ago
@metahumansdk I can't download my custom MetaHuman in Unreal Engine 5 Preview 2, can you help me with this?
@sadevanka8542 2 years ago
@metahumansdk I'm using Quixel Bridge...
@nextgenmatrix 2 years ago
Here is an example of how powerful this SDK is when you put your mind to it: kzbin.info/www/bejne/hWiynX-bisyIias
@stuff7274 2 years ago
Damn, that's pretty good stuff!
@_pastras 1 year ago
Looks horrible, honestly.
@VladGohn 2 years ago
Guys, couldn't you have made the demo project even scarier? I understand everything, but this is just awful; was it really impossible to give your own MetaHuman a normal face and set up a bit of lighting? It looks just terrible, the sound quality is terrible, and as documentation it is completely unusable. Thanks((
@2010Edgars 1 year ago
The lipsync isn't good. Nobody speaks like this.
@metahumansdk 1 year ago
Dear EdgarsAL_77, thank you for the feedback. We have already completed some improvements to our lipsync animation, and every day we try to make the generated lipsync better. You can watch our tutorial to compare the result; hope you will enjoy this one: kzbin.info/www/bejne/rqCXaGetbZJnl8U
Tutorial: Metahuman's Head Detachment Fix
6:50
MetaHumanSDK
5K views
Tutorials: how to use the plugin
19:15
MetaHumanSDK
42K views