Made a B3D Loader

Announce new projects or updates of Irrlicht Engine related tools, games, and applications.
Also check the Wiki
RichardT
Posts: 4
Joined: Sun Aug 13, 2006 2:44 pm

Post by RichardT »

Maybe that's how Fredborg designed it? I'm a registered Gile[s] user too. I like that program for 99% of what I do.

Now I'm in the process of learning to code with C++ and trying to learn Irrlicht too. Sometimes I feel like I'm eyes-deep in the mire. :D
"Do or do not. There is no try." - Yoda
Luke
Admin
Posts: 449
Joined: Fri Jul 14, 2006 7:55 am
Location: Australia
Contact:

Post by Luke »

RichardT:

The loader (for now) only looks for the textures in the same folder as the mesh it's loading (Blitz3D does a similar thing). That seems to me a lot better than ignoring the folder the mesh is in, which I think is what some of the other Irrlicht mesh loaders do. Why can't you put the textures in the same folder as the mesh?

One thing Blitz3D did that bugged me was loading textures from absolute paths (e.g. "C:\whatever\car.jpg") even when your game was in a completely different path, so your game would be fine on your computer but then strangely lose half its textures on any other computer.
I don't know if that will help you, but I'm looking forward to the plugin so that I can use the models I create in my programs done with Irrlicht. This is one of them, a screenie from Gile[s].
I'm not sure which plugin you're talking about, the Blender one? Irrlicht 1.1 should be able to load that mesh if you just change the blending mode to alpha in Gile[s], like I said to CK_MACK.

BTW, Blitz3D was written using the DX7 SDK. You might be able to look in there to see how DX uses the alpha, multiply and additive modes for rendering. You can get it at:

Yeah, but Irrlicht is designed to run with both DirectX and OpenGL, and I'd have to change the driver code to add it; I think it's best left to an Irrlicht team developer.



Vermeer:

There are heaps of blending modes for lightmapping, that's no problem, but that's just blending two textures together. What I need are blending options against the other meshes behind it (different ways the meshes can be see-through).


Anyone:

On the same subject, it would be nice if there were an 'Alpha' value in the materials that worked with (most/some) of the MaterialTypes.
E.g. EMT_TRANSPARENT_ADD_COLOR is good for lasers, but you cannot control its alpha to fade it out. EMT_TRANSPARENT_ALPHA_CHANNEL_REF is good for trees, but most tree leaves look better with a bit of alpha, which you cannot do either.

I know the drivers cannot do alpha with some MaterialTypes, and where it is possible (and has been added to Irrlicht) MaterialTypeParam controls the alpha, but it seems like a pretty restrictive system.
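
For reference, the user side currently looks roughly like this. It's just a sketch of the material types named above; exactly what MaterialTypeParam does varies by material type and driver, so the commented value is an assumption, not documented behaviour.

Code:

#include <irrlicht.h>
using namespace irr;

// Sketch: setting up the two transparent materials discussed above.
void setupLaser(scene::ISceneNode* laser)
{
    // Additive blending, good for lasers/glows, but there is no general
    // per-material alpha value to fade the whole thing out.
    laser->setMaterialType(video::EMT_TRANSPARENT_ADD_COLOR);
}

void setupLeaves(scene::ISceneNode* tree)
{
    // Alpha-test style transparency for leaves; again there is no general
    // "fade the whole mesh" alpha to go with it.
    tree->setMaterialType(video::EMT_TRANSPARENT_ALPHA_CHANNEL_REF);

    // Where a material type does use MaterialTypeParam, this is the only
    // per-material knob available (its meaning depends on type/driver;
    // the 0.5f is only an illustrative value):
    // tree->getMaterial(0).MaterialTypeParam = 0.5f;
}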
RichardT
Posts: 4
Joined: Sun Aug 13, 2006 2:44 pm

Post by RichardT »

@ Luke: Re - 'plugin'

Sorry, misreading again. I get in a hurry and read things too fast. I'll grab your loader and use it with Irrlicht.

I've been wishing that B3D had the shader capabilities and all the other 'gee whiz' stuff from DX 9. That's why I'm trying to learn how to use Irrlicht, so that I can take advantage of the HLSL features, etc.

Apologies.
"Do or do not. There is no try." - Yoda
Luke
Admin
Posts: 449
Joined: Fri Jul 14, 2006 7:55 am
Location: Australia
Contact:

Post by Luke »

RichardT:
I'll grab your loader and use it with Irrlicht.
You know the loader is already included in Irrlicht 1.1, right?

I've been wishing that B3D had the shader capabilities and all the other 'gee whiz' stuff from DX 9. That's why I'm trying to learn how to use Irrlicht, so that I can take advantage of the HLSL features, etc.
You can use shaders on the b3d meshes like on any other Irrlicht mesh, but the shaders are not loaded from the mesh. Personally I think using shaders for extra graphics is the least important thing to think about when making a game; shaders don't make or break a game.
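
In code it's the same as any other mesh, something along these lines (the file path is a placeholder, and the commented-out shader material is whatever you create yourself, e.g. through the driver's GPU programming services; it is not read from the b3d):

Code:

#include <irrlicht.h>
using namespace irr;

// Sketch: a .b3d loads like any other Irrlicht mesh; a shader is applied
// afterwards as a material type, not loaded from the file.
void loadB3dWithShader(scene::ISceneManager* smgr)
{
    scene::IAnimatedMesh* mesh = smgr->getMesh("media/character.b3d");
    scene::IAnimatedMeshSceneNode* node = smgr->addAnimatedMeshSceneNode(mesh);

    if (node)
    {
        node->setMaterialFlag(video::EMF_LIGHTING, false);

        // A custom shader material created elsewhere (for example via the
        // driver's IGPUProgrammingServices) could be assigned here:
        // node->setMaterialType((video::E_MATERIAL_TYPE)myShaderMaterial);
    }
}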
juliusctw
Posts: 392
Joined: Fri Apr 21, 2006 6:56 am
Contact:

b3d documentation

Post by juliusctw »

hello luke

I have been looking into my b3d runtime animation. I think all I need are the joint name, the transition matrix, and the time the transition takes. Since it is really difficult to generate these numbers accurately for something like a jump sequence, I think it would be easier for me to animate them in Blender and export to the b3d format. I could then write a little script to extract the necessary information. I would only need to do this once, and all other models would follow the same sequence.

The problem is that I have no clue how I would go about extracting this information. I looked for b3d docs online, but didn't have much luck. Do you have any suggestions on how I would go about extracting the information from the b3d files? :D
Luke
Admin
Posts: 449
Joined: Fri Jul 14, 2006 7:55 am
Location: Australia
Contact:

Post by Luke »

Juliusctw:


I’m not sure what you’re trying to do.

Are you trying to extract the animation of the b3d file or just data?

If it is just data like joint names, download a demo of Unwrap3D (http://www.unwrap3d.com/index.aspx) and the b3d exporter/importer from the same site. The demo won't let you save and the plug-in doesn't always load b3d well, but it's a handy program.

If it's the animation you want extracted, tell me; there are a few methods, but I'm not sure why you need to extract the animation anyway, there are easier ways…
vermeer
Posts: 2017
Joined: Wed Jan 21, 2004 3:22 pm
Contact:

Post by vermeer »

I have Ultimate Unwrap.

Sorry, Julius, I haven't checked that email for a long time. Will do now. Too busy; I forgot.

Yup, if you said what you're aiming for, Julius, we could help better. I personally know a bunch of tricks, and I have experience with i/o in Ultimate Unwrap between the b3d and x formats, passing from one to the other and back, etc... There are some tools for handling that, but apart from Ultimate Unwrap, not many... Probably the best bet is to handle it in code... Just mention your purpose, if it's not a secret or something, so we can save you too much extra coding or the like, or maybe point you (Luke will know) to some available routines or something.

Or something.


;)
Finally making games again!
http://www.konekogames.com
vermeer
Posts: 2017
Joined: Wed Jan 21, 2004 3:22 pm
Contact:

Post by vermeer »

Julius, I sent you a "tiny" comment to help you with texturing in Blender.
Finally making games again!
http://www.konekogames.com
juliusctw
Posts: 392
Joined: Fri Apr 21, 2006 6:56 am
Contact:

Post by juliusctw »

hello luke and vermeer

I just need the animation info to manipulate the joints in real time to simulate walking, running, etc. My plan was to use existing b3d files and extract that info, then use that info to animate rigged meshes.

I was going to buy a couple of motion capture animations, somehow extract the info I need, and apply it to "all" my meshes.



________________________________________________
here's my plan

- buy or find excellent motion capture animations
- export the animated version to b3d
- extract the animation info and convert it into data so I can
use that data to animate a non-animated version in real time

- once I get the animation info, I get rid of the animated mesh
- I will install all the non-animated but rigged meshes
- use the collected data to animate the rigged meshes in real time

I want to do it this way so I only have to do the animation once and each b3d file will be smaller.
Last edited by juliusctw on Sat Sep 23, 2006 5:25 pm, edited 1 time in total.
juliusctw
Posts: 392
Joined: Fri Apr 21, 2006 6:56 am
Contact:

thank you vermeer

Post by juliusctw »

Thank you, Vermeer. I will probably spend the next entire month studying art and all the links you have given me. Before I start, I have several questions.

You mentioned that not all formats support UV mapping. Which formats "do" support UV mapping? I would like to know specifically for obj, b3d, and x.

You mentioned "smoothing groups", but I still don't know what that means.

After reading your PM, you didn't seem to mention the technique I read about in books:
1. export the UV map to Gimp
2. color in a base color layer
3. put in the brightness layer (is that also called a light map?)
4. put in the bump map layer
Is this a valid approach?

I keep hearing about LSCM for Blender, but it seems that there are several other UV mapping approaches, such as painting directly on the mesh. How many ways are there in Blender? Which one do you recommend?

And lastly, I have finished my university campus (3ds format), and I would like to put automatic lighting in there. I think I can do it in irrEdit. My question is, how should the lights be positioned? Is there a general technique? How many should I have? I don't know much about lighting techniques. :D :D
vermeer
Posts: 2017
Joined: Wed Jan 21, 2004 3:22 pm
Contact:

Re: thank you vermeer

Post by vermeer »

juliusctw wrote:thank you vermeer, i will probably spent the next entire month studying art and all the links you have given me.
One tip: start with the ones I labeled as basic... it's probably better, though not necessarily in order. Your first step is UV mapping, *not* texturing.
Before I start, i have several questions.

you mentioned that not all formats supports UV mapping, which format "does" support UV mapping, i would like to know specifically for obj , b3d, x?
That was mainly to point out the fact that not all formats support everything, and that not all Blender features can be exported.
In the case of OBJ, b3d, and X, all of them support UV mapping. b3d is probably the most complete of these formats. My everlasting issue was that no free tool exported character animations to b3d... but Luke is solving that, and Blender is very powerful for character animation...

OBJ is great for static meshes, but imho... b3d is going to be a much better deal. b3d supports lightmaps as well as multitexturing, most of all because it supports multiple UV channels. Sadly, Blender only has one, but there are two or three free tools for generating lightmaps.

you have mentioned "smooth group", but i still don't know what that means.
Smoothing normals, in my not-so-deep knowledge of the internals, is a way in which light gets distributed in real time. It allows faceted surfaces to look smooth without increasing the polycount, as the edges get smoothed, blended visually. On the art side, you can create visual discontinuities in this smooth surface wherever it's interesting: e.g. a real-life sharp edge, like the edge of an axe, or a clearer case, if you want sharp borders on clothes in some places, or if you want the mouth to look like it has the typical crease.
b3d, obj, and x all support this; it's just that not all packages do. Blender will, in a build that I think is already being made...

After reading your pm, you didn't seem to mention the technique I read from books,
1. export UV map to gimp
OK, this comes much later. You start this *once* you have UV mapped the model; usually you will do the UVs in Blender.
2. Color in a base color layer
3. Put in the brightness layer, (is that also called light map? )
No. You are texturing it. A lightmap usually involves a different procedure. You will have a texture map, and you will have a lightmap. Those are different bitmaps, and different workflows.

A model with its material can be uber complex.
But basically, a game model should:

- not have too many vertices, so it can be used in real-time 3D;
- have smoothing groups (smoothing normals) applied;
- then have UVs for this model (the UV map is NOT a bitmap; it is internal info that the mesh receives). The only bitmap to appear at this stage is the UV LAYOUT, which you export as a bmp or tga for REFERENCE once you have made your UVs, so you can paint your texture. Once it becomes a nice texture in Gimp, of course, you will apply it as the model's texture, assigning it to the material's diffuse map. This works the same way in every package out there, though the buttons or names vary.
- As I am telling you, you export that UV template bitmap from Blender and load it in Gimp. Then you do the procedure you describe: make a layer with color, and above it one with (surely you mean) lights and shadows, i.e. drawing in greyscale in multiply layer mode or similar so it only affects the color layer below, building up volume to make your drawn texture...

Or maybe what you have read somewhere is that one can make a multi-layer Gimp document and paint there so that the layers match: the base texture in one layer, and then, based on that as reference, a greyscale layer that would be the specular map, another greyscale one for generating a standard bump map... (not normal maps, those would be more complex).

But no. Forget about so much complexity. First concentrate on doing a simple texture for the model. A game model can do with just that if it's well done.
Only this: you export the UV template bitmap from Blender and paint over it in Gimp (but I advise you to use free photos first, just editing them like a collage; it's easier). I'd recommend doing it in a layer over the template.

4. Put in the bump map layer
is this a valid approach?
Forget the bump map for now. Besides, using a bump map with elegance is not easy the first time ;)

I keep hearing about LSCM for blender, but it seems that there are several other UV mapping approaches, such as painting directly on the mesh. How many ways are there in blender? Which one do you recommend?
Well... the problem is that you still have some confusion in your mind about what UV mapping is and what painting/texturing actually is. They are two separate procedures, done with different tools.

Deep Paint 3D 1.6 allowed you to do the UV maps in the same interface, but it's sort of impractical imo. And anyway, it was really UV mapping software crudely embedded into the painting application.

Don't confuse painting, which you can do in 2D (over the UV template) or with 3D painting. In both cases you are painting pixels into a bitmap, not generating UV coords; that step has to be done before. And in both cases you actually paint over the 2D template; it's just that with 3D painting you visually paint over the 3D model, though you can go back and forth to Gimp to do 2D touch-ups instead of 3D strokes.

So, for texturing, 3D painting is easier, but the tools are usually nowhere near as accurate and good as Gimp or Photoshop for the actual painting. I haven't yet tested Blender's new texture painting; I'm afraid it'd be useful mainly for marking up where things go, so it's easier later on in the 2D painting tool.

And lastly, I have finished my University Campus (3ds format),
The 3ds format often gives problems with smoothing info, I think because it does not support it well... it actually breaks the mesh wherever two UV coords are found on a single 3D mesh vertex. You could convert the 3ds files to OBJ, x, or b3d. But if those 3ds files already look good to you, OK then. Anyway, levels usually look more or less OK with that format.
I would like to put automatic light in there. I think i can do it in irrEdit.
Oh, OK. Yup, I was forgetting that. Yup, definitely, go for it; it's your best choice. You're referring to lightmaps, and yes, that functionality is already inside irrEdit.
One less thing to worry about ;)
Indeed, you can import the mesh from whatever tool; Niko has made the levels side of things much easier thanks to that.

My question is, how should the light be position? Is there a general technique? How many should I have? I don't know much about lighting techniques. :D :D
If you're referring to lightmaps, you place as many lights as you want, wherever you want, and it will calculate the lightmap based on that. The lightmap is a greyscale map that makes the level look lit and shadowed, giving ambience, volume, and realism. A well-lightmapped level is everything for a game's ambience and feel. So the positions, number of lights, light colors, etc. are all up to your scene and your wishes.
Finally making games again!
http://www.konekogames.com
Luke
Admin
Posts: 449
Joined: Fri Jul 14, 2006 7:55 am
Location: Australia
Contact:

Post by Luke »

Juliusctw:


Sorry for the delay, been busy,

OK, first: if you're going to be animating meshes from data, you're going to need to make something to play the animation back from that data. Not too hard, but it's a bit of work. It can be faster (in CPU) than my method if you do it correctly and optimise it well.

My method would be to keep an animated (possibly skinless) b3d hidden in the background and copy the joint data over to the real mesh as it animates. If you don't understand how this would work (or don't think it would work well) then tell me, and I'll go into more detail.

Back to what you're asking: one quick and dirty method of getting the animation data would be to call setCurrentFrame (which works now), go through each frame (jump in steps of 100 frames with b3d meshes), and record the positions and rotations of the joints. The disadvantage is that you 'bake' the animation without getting any extra animation info.

Another way is to make the animation data from the b3d loader public, maybe by adding a function in the loader to access it. Or you could just copy the whole loader code and strip out the unneeded bits (be careful: b3d files are binary, so one missed byte read anywhere will wreck the loaded data).
Then you can copy the animation from it and do whatever you like; the animation data is stored in each joint/bone.
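
To illustrate the quick-and-dirty baking idea, here's a rough sketch. It leans on the names discussed in this thread (setCurrentFrame on the node, getMatrixOfJoint on the b3d mesh interface); I'm guessing at the exact signatures, so treat it as a starting point rather than finished code.

Code:

#include <irrlicht.h>
#include <vector>
using namespace irr;

// One baked pose: a matrix per joint for a single keyframe.
typedef std::vector<core::matrix4> BakedPose;

// Steps through the animation and records every joint's matrix.
// IAnimatedMeshB3d / getMatrixOfJoint are assumed from this thread;
// their real signatures may differ in your Irrlicht version.
std::vector<BakedPose> bakeAnimation(scene::IAnimatedMeshSceneNode* node,
                                     scene::IAnimatedMeshB3d* b3dMesh,
                                     s32 jointCount,
                                     s32 firstFrame, s32 lastFrame)
{
    std::vector<BakedPose> baked;

    // b3d animations jump in steps of 100 frames, as noted above.
    for (s32 frame = firstFrame; frame <= lastFrame; frame += 100)
    {
        node->setCurrentFrame(frame); // move the hidden mesh to this keyframe

        BakedPose pose;
        for (s32 j = 0; j < jointCount; ++j)
            pose.push_back(*b3dMesh->getMatrixOfJoint(j, frame)); // assumed call

        baked.push_back(pose);
    }
    return baked;
}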


Vermeer:

About the b3d UV mapping: it seems Blender can have different UVs per vertex. I made the exporter support this a while ago by cloning vertices smartly (doing the minimum possible clones, a headache to make), and it seems to work well; it should be just as good/fast as if the format fully supported it. Should I add a button to turn UV cloning off, or is it easy to change to a single UV per vertex in Blender?

I'll also add a button to export normals from Blender; it seems from your posts that you want this?


I have not worked on the exporter much recently because I've been busy, and it's hard to stay motivated when it's just bug hunting… I'll work on it soon.
juliusctw
Posts: 392
Joined: Fri Apr 21, 2006 6:56 am
Contact:

Post by juliusctw »

Hello luke

I think your way with another animated mesh would work; tell me more :D


I have another question. Let's say I make a pair of pants for my character. I wanted the leg to be a separate mesh so I could just switch it, but you said I should superimpose the pants, and I don't see how that's more beneficial. I could animate the new pants exactly the same way you just suggested, with a background animation, and just attach them to the body.
Luke
Admin
Posts: 449
Joined: Fri Jul 14, 2006 7:55 am
Location: Australia
Contact:

Post by Luke »

Juliusctw:


Did you edit your post? Because before, you were talking about user-created animation, and my way wouldn't work for editing the animation in the game. Are these animations created in your game, or with some tool of yours? Because that in itself is a big project. I was watching your video on Google; it seems like a massive project, and I hope you know just how big it is.
I think your way with another animated mesh would work, tell me more
There's lots I could talk about; could you narrow it down? Is there anything you don't understand?

I have another question. Let's say I make a pair of pants for my character. I wanted the leg to be a separate mesh so I could just switch it, but you said I should superimpose the pants, and I don't see how that's more beneficial. I could animate the new pants exactly the same way you just suggested, with a background animation, and just attach them to the body.
By superimpose I presume you mean add the pants to the leg’s texture.

Yes, if you're very careful your method could work, but first you have the overhead of a leg under the pants (more polys to render, and more vertices to animate); second, you somehow have to rig the leg and the pants to move in exactly the same way when they animate so the leg doesn't go through the pants (and keep that relationship as you update the meshes with better ones, etc.); and finally it's more work to program.

What's wrong with the texturing method? Is there anything you want that it cannot do?
juliusctw
Posts: 392
Joined: Fri Apr 21, 2006 6:56 am
Contact:

Hello

Post by juliusctw »

hey sorry

I did change my post. Originally I thought it wouldn't work, but I realized that I have enough to do as it is. Your way may not be as versatile, but it will do for now.

Yes, I know exactly how big my project is. I have a reputation for finishing gigantic projects; this is actually one of the smaller ones I have done (writing my own book took 14 years). This one would only take a couple of years, and I have already finished a quarter of it in half a year. I left my day job to do this full time, so my progress is fast. There are only two people on the team, so I don't need to hold meetings. Luckily, this just may be the most lucrative project (lucrative enough for me to have investors).

I have to take this chance to thank you for the b3d loader, because I had my project standardized on md2 and it wasn't very good; I needed skeletal manipulation. And no matter how much Vermeer helped me, I could not for the life of me export to the .x format. If my project works out, I'll definitely attempt to hire you. :wink:


Ok back to business

I think you misunderstood me, because you mentioned a leg under the pants. With my method that won't happen. Let me explain it another way.

I want to have a bunch of head meshes
body meshes
arm/hand meshes
leg meshes

They are all completely separate meshes that are designed to fit together perfectly for mixing and matching.

When a player is picking their character, they can mix and match them. When they are done, I would have the body mesh as the parent and attach the leg, arm, head, etc. to that mesh,
so they are all completely separate meshes grouped together.

OK, now let's say I wanted to add body armor.

I would simply remove the current body mesh, replace it with the armor mesh, and re-parent the body parts.

Now I want to change the pants:
I would replace the current leg mesh with the pants mesh.

This way, the leg is completely removed and replaced by the pants.

The pants mesh would have the same skeleton names as the leg mesh, so the animation calls would work on it in exactly the same way. Since the parts are attached to the body, the translation movement is handled automatically. In Irrlicht terms I picture it roughly like the sketch below.
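
(Just a rough sketch of what I mean, with made-up file names; each part is an ordinary scene node parented to the body node and swapped by removing/re-adding it.)

Code:

#include <irrlicht.h>
using namespace irr;

// Each body part is its own animated mesh scene node, parented to the
// body so it follows the body's transforms. File names are placeholders.
scene::IAnimatedMeshSceneNode* addPart(scene::ISceneManager* smgr,
                                       scene::ISceneNode* parent,
                                       const c8* file)
{
    return smgr->addAnimatedMeshSceneNode(smgr->getMesh(file), parent);
}

void buildCharacter(scene::ISceneManager* smgr)
{
    scene::IAnimatedMeshSceneNode* body =
        smgr->addAnimatedMeshSceneNode(smgr->getMesh("body.b3d"));

    scene::IAnimatedMeshSceneNode* legs = addPart(smgr, body, "legs.b3d");
    addPart(smgr, body, "head.b3d");
    addPart(smgr, body, "arms.b3d");

    // Changing pants: drop the old leg part, attach the new one.
    legs->remove();
    legs = addPart(smgr, body, "pants.b3d");
}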

OK, so this is what I need help with for b3d. I need to know if my understanding of your source code is correct in order to implement your approach.

This is how I would implement your approach. When the game is loading, it would load in an animation mesh.
I would animate it using setFrameLoop(),
then get the joint matrices with getMatrixOfJoint() at each frame,
store them in an animation class,
do this for every animation,
and then remove the node.

Then the game would start.

When a character in the game needs to animate, I will call on that class at each frame and use
AddMatrixToJoint().

That's the general idea of how I would implement it; can you tell me if my understanding is correct? A rough sketch of what I mean follows below.
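
(Roughly like this; getMatrixOfJoint/AddMatrixToJoint are the names I understand from your loader, and the signatures here are only my guess.)

Code:

#include <irrlicht.h>
#include <vector>
using namespace irr;

// One stored pose: a matrix per joint, recorded earlier from the
// animated "driver" mesh.
typedef std::vector<core::matrix4> StoredPose;

// Applies a stored pose to a rigged (otherwise unanimated) b3d mesh.
// IAnimatedMeshB3d and AddMatrixToJoint are assumed from this thread;
// the real signatures may differ.
void applyPose(scene::IAnimatedMeshB3d* riggedMesh, StoredPose& pose)
{
    for (u32 j = 0; j < pose.size(); ++j)
        riggedMesh->AddMatrixToJoint((s32)j, &pose[j]); // assumed call
}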


And the second thing I really need help with is understanding how you change the texture dynamically. I must have missed it in your code; I didn't see any methods for it.
Let's say I draw a girl mesh but prepare 3 skin tones; how do I switch between them?
Last edited by juliusctw on Thu Sep 28, 2006 6:29 pm, edited 1 time in total.