Is it possible to train Midjourney so you can create the same character from different angles and perspectives? This is how to do it. The secret to training Midjourney AI, a Stable Diffusion tutorial, and how to do character design with the help of Midjourney, DALL-E and Photoshop.
SUBSCRIBE
▶ handle: / @levendestreg
▶ You can subscribe to our channel here: kzbin.info...
▶ Read more: levendestreg.dk/en
▶ Download cheatsheet: levendestreg.dk/en/how-to-tra...
-------------------------------
Links
▶ Documentation: midjourney.gitbook.io/docs/us...
▶ Midjourney's website: midjourney.com/home
▶ Once logged in click this link to see your images: www.midjourney.com/app/
▶ sketchfab.com/tags/head
▶ • DREAMBOOTH: Train MULT...
▶ monstermash.zone/
▶ huggingface.co/spaces/pharma/...
-------------------------------
This video is created on the Wacom Cintiq Pro 32 with Macbook Pro, Atem Mini Pro, Midjourney, DALL-E, Premiere Pro, After Effects and Photoshop.
00:00:00
Can you train Midjourney? I mean train it so that you can create the same character in different poses and from different angles? And if that’s really possible - why isn’t everybody else doing it?
00:00:19
And a lot of you wonderful people have asked about the possibility of somehow training Midjourney to produce the results you want. Or training it on different characters - or styles that you want to create.
00:00:56
Remember there are links in the description below among other things to a FREE CHEAT SHEET where I share some of my prompts with you.
00:01:25
What AI is the best for my needs? What AI can I get the best results with? And most importantly how can I train AI to create the character or style that I want to create?
00:02:09
But is it possible to achieve that with Midjourney? Well, yes and no.
00:02:20
As the guidelines for Midjourney state - you cannot feed images directly into Midjourney like that, due to concerns about public community content.
00:02:30
Instead Midjourney lets you use images as inspiration - the img2img feature - usually along with text, to guide the generation of an image.
00:02:50
And as I explained in this video about character design
00:03:09
First let’s see if we can get Midjourney to give us the cropping and content we want - the right part of the image, so to speak.
00:03:53
I’m going to give you seven different shots to choose from. There are a lot more - you just need to Google them. But for now, we’ll take a look at seven.
00:04:14
Now first we have the Full shot. Then the Medium full shot, Cowboy shot, Medium shot, Medium close-up, Closeup and Extreme close up. And then you have to choose your angle too.
00:04:54
So now you know the basics of what to ask Midjourney for. The trick here is to actually mention the angle you see the face from, for instance. And I’ve had good results with mentioning details about the eyes too, so Midjourney doesn’t mess up the eyes.
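Putting the shot type, angle and eye details together, a prompt could look something like this - the character description is just a placeholder, so swap in your own:

```
/imagine prompt: young red-haired heroine, medium close-up,
three-quarter view, face turned slightly to the left,
detailed bright green eyes, flat illustration style
```

The shot term (medium close-up) controls the cropping, and the angle and eye details keep the face consistent between generations.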
00:06:05
To get the remix setting switched on, you simply write /settings - and toggle Remix mode on.
00:06:48
The seed parameter sets the seed for an image. So, when you use a seed, it means that Midjourney will use the same starting noise for the diffusion process to create your image from.
00:07:04
And using a seed can sometimes help keep things more consistent and easier to replicate when you run a similar prompt again or want the same sort of image.
00:07:15
But - it can also be used for creating the same character in different poses, from different angles and perspectives.
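In practice that means reusing one seed across prompts that only change the pose or angle. The seed number here is made up for illustration - you’d use the one Midjourney sends you:

```
/imagine prompt: young red-haired heroine, full shot, standing pose --seed 1234567
/imagine prompt: young red-haired heroine, full shot, running pose --seed 1234567
```

Because both prompts start from the same noise, the results tend to stay much closer to the same character than two random generations would.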
00:07:53
Now how do you find the seed for the image you want, then? Well, you add a reaction to the image you want to get the seed from. And you do this by right-clicking on the image.
Then you press “Add Reaction” and then “Other Reactions”. And here you just start to write env - and that’ll bring up the envelope emoji.
00:08:14
Now Midjourney will send you a private message. And here you get the seed number.
00:08:47
And as you can tell - all of a sudden I have a character to work with. It’s magic.
00:08:55
So, what is sameseed? You use the --sameseed parameter to affect all images in the resulting grid in the same way.
00:09:13
But if you use sameseed - the four images in the grid will all start from the same noise. You can then use DALL-E’s paint-out tool and Photoshop to do corrections.
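The difference between the two parameters in practice (seed values are again hypothetical): --seed makes a grid reproducible but still varied, while --sameseed pushes all four grid images toward the same result.

```
/imagine prompt: young red-haired heroine, portrait --seed 1234567
(four varied images, reproducible with the same seed)

/imagine prompt: young red-haired heroine, portrait --sameseed 1234567
(four very similar images, all from the same starting noise)
```

Use --sameseed when you want near-duplicates of one character to pick from and touch up, and plain --seed when you want consistent but distinct variations.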
00:09:29
If you want to learn more about DALL-E and the paint-out tool well, I’ve got an upcoming episode on that topic.
00:10:16
Because is there an easy way to turn your design into 3D models? The fast answer? Yes - and no! So, of course Adobe has some amazing tools to create 3D models.
00:10:31
And in general 3D has a steep learning curve. But of course there is always a shortcut or two that you can take.
00:10:41
So if you go to monstermash.zone/ and upload a 2D image - you can actually turn it into a 3D model - in a fairly simple way.