
Enjon: 3D/2D Game Engine, C++ / OpenGL DevLog #7

2,609 views

John Jackson


Comments: 25
@johnjackson9767 6 years ago
For anyone interested, I've created a Discord channel ( link in description ). Feel free to join to keep up to date with progress or ask any questions!
@thehambone1454 6 years ago
Awesome stuff! It always feels good to know that I'm not the only one working on their own tech :) :)
@johnjackson9767 6 years ago
Keep it up! There definitely aren't enough people working on their own tech anymore in industry, so keep with it and you'll have no trouble landing a job if you want it.
@matthewwarner3909 6 years ago
Hey, I was wondering if you could elaborate more on the meta system when it comes to ImGui. For example, if you create a Movement component with a couple of variables in it, how do you go about drawing the component without the user defining how to draw it, while still allowing them to define it if they choose to? My thought was that you have a map in some class, maybe called "EditorManager", which holds the meta id and a custom editor class instance that defines how to draw it? I think my confusion comes in when talking about predefined types.... Do I need to make a wrapper class for them, or do I just manually fill in meta info for these built-in types? Again, amazing job with everything so far!!!
@johnjackson9767 6 years ago
Since all reflected objects derive from Enjon::Object, I'm able to grab any of their reflection info and then draw properties in any manner that I or a user chooses. So somewhere in my editor code, I simply call this on any object that I want to draw:

    void ImGuiManager::InspectObject( const Object* object )
    {
        Result res = const_cast< Object* >( object )->OnEditorUI( );
        if ( res == Result::INCOMPLETE )
        {
            DebugDumpObject( object );
        }
    }

( Ignore the awful const cast, btw. ) What this does is look for a user-defined override of "OnEditorUI()". If one is given, the user has the option to draw that piece of editor UI any way they choose; I do this myself for specific things as well, including some Entity properties. The user is then expected to return a result. If that result is the default of INCOMPLETE, the default behavior kicks in, which is to call "DebugDumpObject( Object* obj )" to do some default property rendering. DebugDumpObject loops through each of the properties in the object's class and has default drawing behavior for them that I've defined myself. It looks something like this:

    void ImGuiManager::DebugDumpObject( Object* object )
    {
        const MetaClass* cls = object->Class();
        for ( usize i = 0; i < cls->GetPropertyCount(); ++i )
        {
            // ( prop is the i-th property of cls and name its display name;
            //   their retrieval is elided here )
            switch ( prop->GetType() )
            {
                case MetaPropertyType::U32:
                {
                    u32 val = 0;
                    cls->GetValue( object, prop, &val );
                    Enjon::MetaPropertyTraits traits = prop->GetTraits( );
                    if ( traits.UseSlider( ) )
                    {
                        if ( ImGui::SliderInt( fmt::format("##{}", name).c_str(), ( s32* )&val, ( s32 )traits.GetUIMin( ), ( s32 )traits.GetUIMax( ) ) )
                        {
                            cls->SetValue( object, prop, ( u32 )val );
                        }
                    }
                    else
                    {
                        if ( ImGui::DragInt( fmt::format("##{}", name).c_str(), ( s32* )&val ) )
                        {
                            cls->SetValue( object, prop, ( u32 )val );
                        }
                    }
                } break;

                // .... other types follow
            }
        }
    }

Hope that helps!
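A minimal sketch of what a user-side override of that OnEditorUI hook could look like, building on the flow above; the Result::SUCCESS value, the f32 alias, and the MyMovementComponent type are assumptions for illustration, not pulled from Enjon:

    // Hypothetical user component that draws its own editor UI instead of the
    // default DebugDumpObject path. Returning something other than INCOMPLETE
    // tells InspectObject that the drawing has been handled.
    class MyMovementComponent : public Object
    {
    public:
        virtual Result OnEditorUI( ) override
        {
            ImGui::Text( "Movement" );
            ImGui::DragFloat( "Speed", &mSpeed );        // standard Dear ImGui widgets
            ImGui::Checkbox( "Grounded", &mGrounded );
            return Result::SUCCESS;                      // assumed non-INCOMPLETE result value
        }

    private:
        f32  mSpeed    = 5.0f;
        bool mGrounded = false;
    };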
@vamidicreations 6 years ago
Wow, looks amazing! I am currently also working on an engine for a school assignment, and you have shown quite a few good ideas on how you do your rendering, material editor, etc. Would you happen to have a Discord channel where people can ask you for more information about how you did some things in the engine? I would really love it if you could share how you did the editor, because I was struggling quite a bit with it. I use ImGUI_dock, but when I use ImGUI_gizmo it has an offset and weird matrix problems. So I ended up with a hacky solution: rendering the scene normally and rendering the editor stuff on top of the scene. That works, but I don't like the solution.
@johnjackson9767 6 years ago
VamidiGames Thanks! I'll consider putting together a Discord channel - that's a good idea. You seem like you have a bit of understanding already, so what specifically about the renderer would you like to know? The material editor itself is simple, but setting up the backend architecture for it was tricky. Because I needed a way to have multiple scenes in multiple windows, I had to set up the concept of a world that holds its own contexts ( graphics, ImGui, physics, entities ). Once this was set up, making something like the material editor was a breeze. The material backend is more involved, since I generate all the shader code for custom materials that the user can define. Your solution is a valid way to handle that specific case. You could also render it in your docked window on top, provided you set the clip for the viewport and then project your mouse information from "viewport space" to window space. We used that same widget library at work for our editor, but I never liked it, so I made my own in my engine.
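A rough sketch of that "world owns its contexts" idea, with placeholder type names rather than Enjon's actual classes:

    #include <memory>

    // Each context owns the state one scene/window needs.
    struct GraphicsContext { /* render targets, scene data, ... */ };
    struct GuiContext      { /* per-window ImGui state */ };
    struct PhysicsContext  { /* physics scene */ };
    struct EntityContext   { /* entity storage for this world */ };

    // One World per editor window that hosts a scene, so the main viewport and
    // something like a material editor preview can live side by side.
    struct World
    {
        std::unique_ptr< GraphicsContext > graphics = std::make_unique< GraphicsContext >();
        std::unique_ptr< GuiContext >      gui      = std::make_unique< GuiContext >();
        std::unique_ptr< PhysicsContext >  physics  = std::make_unique< PhysicsContext >();
        std::unique_ptr< EntityContext >   entities = std::make_unique< EntityContext >();
    };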
@vamidicreations 6 years ago
Thank you for the quick response! The alternative solution is quite interesting; I will write it down and see how I could implement it. May I ask how you are currently doing it? Are you rendering everything in your scene to a framebuffer and rendering it inside the dock? And for the material editor, I was wondering what you would do if one object graph (JSON file) defines a vec3 as the base color, but another graph has a texture as the base color. How would that pan out, do you end up building your shader file as one? Or would you then have multiple shader files in that scenario?
@johnjackson9767 6 years ago
"Are you rendering everything in your scene to a framebuffer and rendering it inside the dock?" Yes, I'm doing exactly that. Then, when I want to interact with the scene within the docked viewport, I have to transform the mouse input from viewport space into window space. "How would that pan out, do you end up building your shader file as one?" For the final node in the shader graph format, let's say for "base color", a vec4 is expected. It's simple enough to enforce this by doing simple conversions from the various types into vec4 before I pass the value into the final node for output. So if the user plugs a texture's RGB channel into the base color input, then underneath I massage that data from a vec3 into a vec4, similar to this: vec4 final = vec4( RGB, 1.0 );, or something like that.
@vamidicreations 6 years ago
Oh, that is really smart. How did you convert to window space? With inverses of matrices? And if I understand correctly, you do the conversion in C++ before passing it to GLSL?
@johnjackson9767 6 years ago
The conversion has nothing to do with rendering - it's just so I can interact with the objects via the transform widget within a scene. My GBuffer holds an ObjectID buffer, which takes the unique id assigned to a graphics renderable, converts it to a color, and then renders that into the buffer. I then query that information when the mouse clicks in the viewport for a scene to detect whether any object has been grabbed, including the transform widget itself ( since it's just a collection of different unique renderable objects ). This is how I get pixel-perfect accuracy for any of my scene selections. The actual conversion is fairly easy - just figure out what percentage of the viewport the mouse is actually clicking, then use that percentage to scale by the actual backbuffer dimensions. For instance, if I click somewhere in the viewport that happens to be 0.5 in the width and 0.3 in the height, then I can use that normalized information to project into any window size. So if the actual backbuffer size is (800, 600), then the projected mouse coords would be (800, 600) * (0.5, 0.3) = (400, 180). Also, I've created a Discord channel for the engine, so feel free to join that and ask any more questions you might have! discord.gg/p3J7qGb
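A small sketch of that viewport-to-backbuffer projection (the names are illustrative, not Enjon's types):

    struct Vec2 { float x, y; };

    // Project a mouse position inside the docked viewport onto backbuffer coordinates.
    Vec2 ViewportToBackbuffer( Vec2 mouse, Vec2 viewportPos, Vec2 viewportSize, Vec2 backbufferSize )
    {
        // What fraction of the viewport the mouse is over.
        float px = ( mouse.x - viewportPos.x ) / viewportSize.x;
        float py = ( mouse.y - viewportPos.y ) / viewportSize.y;

        // Scale the normalized position by the actual backbuffer dimensions.
        return { px * backbufferSize.x, py * backbufferSize.y };
    }

    // e.g. a click at 50% of the width and 30% of the height of the viewport with an
    // 800x600 backbuffer projects to ( 400, 180 ), which can then be used to sample
    // the ObjectID buffer for picking.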
@matthewwarner3909 6 years ago
I would love it if you could elaborate on how you did your reflection system. There are extremely few resources on how to create reflection data in C++.
@johnjackson9767 6 years ago
Sure thing. I actually went over it a bit in one of my previous videos: kzbin.info/www/bejne/qX3IfmB4nLCCgKs But the basic setup is that I have a base Object class that all objects I want reflected derive from. All reflection data for any object class is stored in a "MetaClass" object, which holds member variable data, function data, as well as type information. I've written a precompiler that parses all of my header files for special markup that I add to any reflected objects; it generates all the reflection code for me, which is then used in the compilation of the engine and any applications. I got the basic idea from watching a Handmade Hero stream where Casey Muratori discusses writing introspection code for his game. I'd highly recommend watching that as well, since it gives a great working example of how to write a simple lexer and parser in order to generate reflection code. Link here: kzbin.info/www/bejne/Z3rainiAqNuXmdU Not sure if that helps you out at all ( it's hard to really go over the system completely in a YouTube comment ), so if you have any other questions, please ask!
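To make the shape of that generated data concrete, here is a minimal self-contained sketch of the pattern (reflected properties collected into a MetaClass by generated code); the names are illustrative, not Enjon's actual markup or output:

    #include <cstddef>
    #include <string>
    #include <vector>

    enum class MetaPropertyType { F32, U32 };

    struct MetaProperty
    {
        MetaPropertyType mType;
        std::string      mName;
        size_t           mOffset;   // byte offset of the member within the object
    };

    struct MetaClass
    {
        std::string               mName;
        std::vector<MetaProperty> mProperties;
    };

    // The class a user would mark up in a header with the special annotations
    // the precompiler scans for.
    struct MovementComponent
    {
        float mSpeed   = 0.0f;
        float mGravity = 9.8f;
    };

    // What the generated reflection code boils down to: a function that fills in
    // the MetaClass for this type.
    MetaClass ConstructMetaClass_MovementComponent()
    {
        MetaClass cls;
        cls.mName = "MovementComponent";
        cls.mProperties.push_back( { MetaPropertyType::F32, "mSpeed",   offsetof( MovementComponent, mSpeed ) } );
        cls.mProperties.push_back( { MetaPropertyType::F32, "mGravity", offsetof( MovementComponent, mGravity ) } );
        return cls;
    }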
@matthewwarner3909 6 years ago
Makes sense. I think the problem I was having was trying to mimic C# too much, in that every single type in the language has meta info about it (i.e. int, float, double). I think I am starting to get the hang of things. I am currently using the libclang Python binding for generating the metadata and Jinja2 for the template engine. So far it has proven very useful and generates meta info similar to the way you have done it. My only question is how do you add the metadata into the meta database before the engine runs? Or do you do something else that allows you to look up objects by string? P.S. Really appreciate the quick response!
@johnjackson9767 6 years ago
Sorry, just caught your response - something's up with YouTube where it doesn't reliably notify me when someone has responded to a comment. Anyway, binding all the metadata into my registry is fairly simple - a static function runs a "bind" process as part of engine startup. Object classes can be looked up by string, but they can also be looked up by an id that is generated and assigned as part of the reflection generation. So when the engine starts and the registry is created, I call this:

    // Binding function for Enjon::Object that is called at startup for reflection
    void Object::BindMetaClasses()
    {
        Object::RegisterMetaClass< SkeletalAnimationAssetLoader >();
        Object::RegisterMetaClass< MeshAssetLoader >();
        Object::RegisterMetaClass< BoxCollisionShape >();
        Object::RegisterMetaClass< ArchetypeAssetLoader >();
        ... // All other classes that are reflected
    }

This function is part of my generated.cpp file, which is just included as part of the build process. The engine has its own generated file, as does any user application code. And here is where this is called in the engine startup:

    Enjon::Result Engine::InitSubsystems()
    {
        // Meta class registration
        mMetaClassRegisty = new MetaClassRegistry( );

        // Register all base object meta classes
        Enjon::Object::BindMetaClasses( );

        // Register and bind all application specific meta classes
        GetApplication()->BindApplicationMetaClasses( );

        // Other init processes...
    }
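A rough sketch of what the registry side of that bind step could look like underneath; the accessors on T are hypothetical, not Enjon's actual generated API:

    #include <string>
    #include <unordered_map>

    struct MetaClass; // reflection data for one class, as described above

    class MetaClassRegistry
    {
    public:
        template < typename T >
        void Register()
        {
            // Each reflected type would expose its generated MetaClass, name, and id.
            const MetaClass* cls = T::GetMetaClass();    // hypothetical generated accessor
            mByName[ T::GetTypeName() ] = cls;           // lookup by string
            mById[ T::GetTypeId() ]     = cls;           // lookup by generated id
        }

        const MetaClass* Find( const std::string& name ) const
        {
            auto it = mByName.find( name );
            return it != mByName.end() ? it->second : nullptr;
        }

    private:
        std::unordered_map< std::string, const MetaClass* >  mByName;
        std::unordered_map< unsigned int, const MetaClass* > mById;
    };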
@matthewwarner3909 6 years ago
Okay, cool. Looks similar to my terrible attempt. I'm currently trying to create an ECS-based engine, since I already created a component-based engine in college. Absolutely love your work so far!
@johnjackson9767 6 years ago
Thanks, man! Would be interested to see what you come up with, so let me know!
@nachocortizo3321 6 years ago
Nice progress!
@sc5shout 6 years ago
How did you make it create new .h and .cpp files with all those methods inside them? Are you still using Dear ImGui for the UI?
@johnjackson9767 6 years ago
Hey, thanks for watching. I have template files with some special markup in them that I parse and replace. For instance, within a template for a new component, I'll have some markup that looks like "#COMPONENT_NAME". All I do when generating new header and source files for a new component is pull in that template from disk, quickly parse and replace those markup keywords with the appropriate information, and then output the files. After successfully outputting them, I can rebuild the project and reload the DLL. I'm still using Dear ImGui but have wrapped most of it for ease of use and cleanliness.
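A small sketch of that parse-and-replace step, assuming a plain-text template on disk containing markers like "#COMPONENT_NAME" (file paths and helper names here are illustrative):

    #include <fstream>
    #include <sstream>
    #include <string>

    // Replace every occurrence of `marker` in `text` with `value`.
    void ReplaceAll( std::string& text, const std::string& marker, const std::string& value )
    {
        for ( size_t pos = text.find( marker ); pos != std::string::npos; pos = text.find( marker, pos + value.size() ) )
        {
            text.replace( pos, marker.size(), value );
        }
    }

    // Read the template, swap in the new component's name, and write the output file.
    bool GenerateComponentFile( const std::string& templatePath, const std::string& outputPath, const std::string& componentName )
    {
        std::ifstream in( templatePath );
        if ( !in.is_open() ) return false;

        std::stringstream buffer;
        buffer << in.rdbuf();
        std::string contents = buffer.str();

        ReplaceAll( contents, "#COMPONENT_NAME", componentName );

        std::ofstream out( outputPath );
        if ( !out.is_open() ) return false;
        out << contents;
        return true;
        // After the .h/.cpp pair is written, the project can be rebuilt and the DLL reloaded.
    }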
@xgamer-od7mb 5 years ago
keep up the good work :D
@Madsycode 3 years ago
Is this ImGui?
@johnjackson9767 3 years ago
Yes
@Madsycode 3 years ago
Good, I'm also working on my own engine!!