
Blender idTechX map editor #29

Open
BielBdeLuna opened this issue Jan 20, 2016 · 65 comments

@BielBdeLuna (Member)

all things pertaining to the Blender idTechX map editor go here, so I stop hijacking others' discussions with my crappy program :)

my repository: https://github.com/BielBdeLuna/Blender_IDtech4_GPL_map
I will keep updating the "alpha" branch until I feel I have a master worth pushing

current version: 0.0.4

here is my plan:
export brush data without correct texture placement 0.0.1--DONE
export surfaces without correct texture placement 0.0.2--DONE
export scaled data 0.0.3--DONE
export material name 0.0.4--DONE
export rotated data 0.0.6
figure out a way to export with correct texture placement 0.1.0

export vertex weighted data in order to open it with idTech X 2.0.0

@ghost commented Jan 21, 2016

It's not crappy, it's way better than my "Hello World" program.

@BielBdeLuna (Member, Author)

:)

@ghost commented Jan 22, 2016

Can't get it to show up in the add-ons or export panels. When running the script from the text block, I get no errors or any output from the console. This is on Blender 2.76.

@BielBdeLuna (Member, Author)

check the "testing" section of the add-ons list

@kortemik (Member)

I wonder if you have had time to work on the format's object structure? If there is a good resource for the format, like there was for md5mesh, I'd like to read through it too to understand how the map format works. The only thing I have is https://modwiki.xnet.fi/MAP_%28file_format%29

@BielBdeLuna (Member, Author)

I'm still working on it.

Do you happen to understand how the texture matrix on that page is defined?

@kortemik (Member)

Well, I think one needs to understand the whole of a brush to understand how the definition is designed:

( plane equation ) ( ( xxscale xyscale xoffset ) ( yxscale yyscale yoffset ) ) "material" ?

// brush 2
{
brushDef3
{
( 0 0 1 -0 ) ( ( 0.00390625 0 0 ) ( 0 0.00390625 0 ) ) "textures/layout/floor02" 0 0 0
( 0 1 0 -256 ) ( ( 0.00390625 0 0 ) ( 0 0.00390625 0.1875 ) ) "textures/layout/floor02" 0 0 0
( 1 0 0 -448 ) ( ( 0.00390625 0 511.96875 ) ( 0 0.00390625 0.1875 ) ) "textures/layout/floor02" 0 0 0
( 0 0 -1 -8 ) ( ( 0.00390625 0 0 ) ( 0 0.00390625 511.9375 ) ) "textures/layout/floor02" 0 0 0
( 0 -1 0 -256 ) ( ( 0.00390625 0 0.0625 ) ( 0 0.00390625 0.1875 ) ) "textures/layout/floor02" 0 0 0
( -1 0 0 -256 ) ( ( 0.00390625 0 0 ) ( 0 0.00390625 0.1875 ) ) "textures/layout/floor02" 0 0 0
}
}

It's a box (a parallelepiped), right? It's defined with planes, not by its points.

In the plane equations we only have the orientation of the surfaces: the first three are positive and the last three negative, meaning all of them face outwards, and since they differ in x, y, z they are all different sides of the box.

The last value of plane equation is the plane offset from the map origin.

However, we now only know the facing of the planes, not anything about the actual faces, right?

There are no more values in the plane equation part to use for defining these either, and after that we have the texture matrix.

The trick here seems to be that the texture matrix, or whatever one likes to call it, also defines the plane size.

x-offset and y-offset are offsets from the x- and y-axis in the texture file. This means one always has a rectangle clipped out of the texture file.

This rectangle, however, might come from a texture file of any size, and does not necessarily map one-to-one onto the real brush. Therefore one scales the texture rectangle, meaning the resulting scaled rectangle is the plane definition of our brush.

By this way of defining it, I'd say that if one were to make a tetrahedron, the x- or y-axis offset*scale would then be zero for making the required triangles in each plane.

I might be totally wrong but this is first time I am looking into this.

-Mikko
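As a sanity check on the brush reading above, any corner of the box can be recovered by intersecting three of its planes. A minimal plain-Python sketch (not from the exporter repository), assuming each ( a b c d ) line encodes the plane a*x + b*y + c*z + d = 0 as the wiki describes:

```python
# Recover a brush corner by intersecting three brushDef3 planes.
# Assumption: ( a b c d ) means a*x + b*y + c*z + d = 0.

def det3(m):
    """Determinant of a 3x3 matrix given as nested lists."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def intersect_planes(p1, p2, p3):
    """Solve n_i . v = -d_i for the shared point v, using Cramer's rule."""
    a = [list(p1[:3]), list(p2[:3]), list(p3[:3])]
    b = [-p1[3], -p2[3], -p3[3]]
    d = det3(a)
    if abs(d) < 1e-9:
        return None  # two of the planes are parallel: no unique corner
    cols = []
    for i in range(3):
        m = [row[:] for row in a]
        for r in range(3):
            m[r][i] = b[r]  # replace column i with the right-hand side
        cols.append(det3(m) / d)
    return tuple(cols)

# Three faces of the floor brush quoted above:
top   = (0, 0, 1, 0)     # z = 0
north = (0, 1, 0, -256)  # y = 256
east  = (1, 0, 0, -448)  # x = 448
corner = intersect_planes(top, north, east)  # the shared corner (448, 256, 0)
```

Running this over all plane triples (and discarding points outside any plane) would rebuild the brush's vertices, which supports the point made later in the thread that the planes alone, without the texture matrix, already bound the brush.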


@BielBdeLuna (Member, Author)

the plane equation part is solved; the question is those "xxscale xyscale" and "yxscale yyscale" values, and also: where is the rotation?

let's remember that you can offset the texture, rotate the texture, and scale the texture, and since all of this happens per brush face (per plane), the texture is projected onto that plane (I guess taking into account the 0,0,0 world origin as the 0,0 of the texture in the projection space)

It would be far easier to just have "xoffset" "yoffset" "rotation" "xscale" and "yscale": 5 numbers instead of the current 6 (!)

the plane equation is found this way: you convert the Blender object to a mesh, then find the faces (polygonal faces, those whose vertices are all coplanar), calculate the normal of each face, then take the first vertex of the face (any face has at least 3 vertices, so it always has a vertex[0]) and calculate the distance between the face plane and the world origin (0,0,0), using that vertex as a point on the plane.
So you end up with the 2 values idTechX needs:

a normal and a distance: 3 floats for the normal and 1 float for the distance (that's the "plane equation")
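The steps just described can be sketched in plain Python (using tuples instead of Blender's bpy/mathutils types so it runs outside Blender; this is an illustration, not the exporter's actual code):

```python
# Compute the idTech plane equation (normal, distance) for a planar face,
# following the recipe above: normal from two edge vectors, distance from
# the world origin using vertex[0] as a point on the plane.

def face_plane(verts):
    """Return ((nx, ny, nz), dist) for a planar face given as a list
    of 3D vertex tuples in winding order."""
    (x0, y0, z0), (x1, y1, z1), (x2, y2, z2) = verts[0], verts[1], verts[2]
    # two edge vectors spanning the face
    ux, uy, uz = x1 - x0, y1 - y0, z1 - z0
    vx, vy, vz = x2 - x0, y2 - y0, z2 - z0
    # normal = u cross v, then normalised to unit length
    nx, ny, nz = uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx
    length = (nx * nx + ny * ny + nz * nz) ** 0.5
    nx, ny, nz = nx / length, ny / length, nz / length
    # distance from the world origin, via the dot product with vertex[0]
    dist = nx * x0 + ny * y0 + nz * z0
    return (nx, ny, nz), dist

# A unit quad lying at z = 8, wound so the normal faces +z:
quad = [(0, 0, 8), (1, 0, 8), (1, 1, 8), (0, 1, 8)]
normal, dist = face_plane(quad)  # normal (0, 0, 1), distance 8
```

Inside Blender the same thing comes almost for free: `polygon.normal` and `polygon.vertices[0]` on a `bpy` mesh give the two ingredients directly.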

@kortemik (Member)

I would assume the rotation is in the facing values; this particular object was aligned to all the axes, so its values were -1, 0, 0 etc., but all of these would have a value on each row if it were not aligned?

For the plane sizing, I'd almost say it depends on the texture equation.


@BielBdeLuna (Member, Author)

but planes have no size, they are infinite, so the brush object comes out of the intersection of infinite planes, which among them create a convex 3D object.

if you want the texture of the object rotated 10 degrees anticlockwise from the standard rotation, how would you express it?

if there are 6 numbers to express the texture coordinates (which it seems could be expressed with just 5 numbers), this could mean they are there to fight precision loss.

@BielBdeLuna (Member, Author)

straight from the archived version of Doom3world that the wiki page points to:

on what is a texturematrix:

"It's a 2x2 matrix that the UV coordinates of the brush are transformed with. So it controls the rotation and scale of the texture on that brush side."

"it's actually a 3 x 2 matrix."

"The first set of numbers is the texture placement on the face in relation to 0 0 0 of the world.

--edit--
wait I've found a code (out of a post about Radiant having it) that I could implement without knowing what actually does:

// compute back the texture matrix from fake shift scale rot
// the matrix returned must be understood as a qtexture_t with width=2 height=2 ( the default one )
void FakeTexCoordsToTexMat( float shift[2], float rot, float scale[2], vec_t texMat[2][3] )
{
	texMat[0][0] = scale[0] * cos( DEG2RAD( rot ) );
	texMat[1][0] = scale[0] * sin( DEG2RAD( rot ) );
	texMat[0][1] = -1.0f * scale[1] * sin( DEG2RAD( rot ) );
	texMat[1][1] = scale[1] * cos( DEG2RAD( rot ) );
	texMat[0][2] = -shift[0];
	texMat[1][2] = shift[1];
}

the post: https://web.archive.org/web/20080308170646/http://www.doom3world.org/phpbb2/viewtopic.php?t=16228

it seems to combine scale and rotation within the first 2 numbers of every number group, while the last number in each group is just the offset; this matches the definition of the map specification in the wiki!
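The C routine above ports directly to Python for quick experimentation. This sketch just transcribes the same rotation/scale/shift layout into a 2x3 matrix; it makes no claims beyond what the Radiant snippet states:

```python
# Python transcription of Radiant's FakeTexCoordsToTexMat: build the 2x3
# texture matrix from shift / rotation / scale. Rotation and scale are
# folded into the first two columns, shift into the last, matching the
# ( ( xx xy xoff ) ( yx yy yoff ) ) groups in a brushDef3 line.
import math

def fake_texcoords_to_texmat(shift, rot_degrees, scale):
    r = math.radians(rot_degrees)
    return [
        [scale[0] * math.cos(r), -scale[1] * math.sin(r), -shift[0]],
        [scale[0] * math.sin(r),  scale[1] * math.cos(r),  shift[1]],
    ]

# With zero rotation the matrix is diagonal, exactly like the
# ( ( 0.00390625 0 0 ) ( 0 0.00390625 0 ) ) entries in the example brush
# (0.00390625 = 1/256, i.e. one texture repeat per 256 map units):
m = fake_texcoords_to_texmat(shift=(0, 0), rot_degrees=0,
                             scale=(0.00390625, 0.00390625))
```

This also answers the "5 numbers vs 6" question above: the six matrix entries are the five intuitive values with rotation pre-multiplied into the scale columns.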

@BielBdeLuna (Member, Author)

I'm pretty much finished on the object creation side. Since I don't have a map creation system working in Blender, I have it geared in a half-hackish way; now I need to test it.
As it is right now, I could have entities owning certain brushes, which is the basis of trigger entities or brush-based doors, since everything is now object-oriented code.

I'm looking at how to implement entity creation in Blender. There is a crucial step here, which is when and how to link entities to the actual brushes and patches: when to create the worldspawn object, and how to link every brush and patch that is not owned by another entity (or has an unspecified owner) to the worldspawn entity.

And, much further in the future: automate a way to read which entities your mod/game uses and how to interpret the entities that your mod/game doesn't use but that exist in the map. For this we will need a Python idTechX API, which I tried to do, but it's damn difficult.

soon I'll post my improvements to the code following @kortemik's suggestion of using object-oriented coding.

@BielBdeLuna (Member, Author)

oops... I've broken the script. It runs now, and the *.map structure looks like any regular map, but every brush has degenerate planes! and patches fail too! this is totally fubar! :)

@kortemik (Member)

have you had any other obstacles with objectifying?

@BielBdeLuna (Member, Author)

no, it's mostly done, but I've somewhat broken patches. They were perfect until I created the classes; the code should do the same, but somewhere there is a thing that breaks it. idTechX complains about the width, but it's OK: it's a 3x3 matrix as it should be. Maybe it's about precision; maybe those numbers have to be floats even if they are "n.0". I'm testing this at the moment; in the map definition, the patches look correctly constructed.
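One way to rule out the ""n.0"" suspicion mentioned above: make the exporter always emit a decimal point, even for whole numbers. A small hedged sketch (hypothetical helper, not from the repository):

```python
# Format a number for a .map file so that whole values keep a trailing
# ".0" (in case the parser rejects bare integers where it expects floats),
# while trimming superfluous trailing zeros from fractional values.

def map_float(value, precision=8):
    """Return the value as text, never as a bare integer."""
    text = f"{value:.{precision}f}".rstrip("0").rstrip(".")
    if "." not in text:
        text += ".0"   # e.g. 2 -> "2.0"
    return text

print(map_float(2))           # whole number keeps a decimal point
print(map_float(0.00390625))  # fractional value is left intact
```

If the patch matrix still fails with this formatting, precision of the values themselves, rather than their textual form, would be the next thing to check.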

@kortemik (Member) commented Feb 1, 2016

please push it on another branch and I can take a look, don't be scared to publish unfinished things :P


@BielBdeLuna (Member, Author)

I've pushed the changes to the alpha branch of my repository: https://github.com/BielBdeLuna/Blender_IDtech4_GPL_map/tree/alpha

at the moment patches are not exported because their export is disabled in the entity class; you just need to change a False to True there and you'll get patch export

@BielBdeLuna (Member, Author)

added the Blender test map I'm using for my tests

@BielBdeLuna (Member, Author)

@kortemik have you seen the code and activated the patches? Did you get the patch error after loading a map with patches?

@kortemik (Member) commented Feb 9, 2016

sorry, I've been quite busy :/ would you have the test-case .blend files? I mean, creating my own would just be a waste of time, as it's way better to have a common set of cases we work on. Perhaps you could create another branch or even a repository containing them


@BielBdeLuna (Member, Author)

I added my blend file, as I stated two posts above: https://github.com/BielBdeLuna/Blender_IDtech4_GPL_map/blob/alpha/ref/maps/test_level.blend

Also check the material names for the objects: each is the name of the material shader for the whole object. This will have to change in the future, in order to have multi-textured objects, but at the moment it's the 101st hack in the code... :)

@BielBdeLuna (Member, Author) commented Jun 10, 2016

After much battling with it I've come to the conclusion that an idTechX editor package in Blender is pretty much impossible, especially because Python scripting as it's used in Blender is far more limiting than what I envision: https://www.blender.org/api/blender_python_api_2_77_1/info_overview.html

this part sums up my restriction:

This is intentionally limited. Currently, for more advanced features such as mesh modifiers, object types, or shader nodes, C/C++ must be used.

I wanted to do a GUI editor and a material editor, and do them with nodes, but as this states there isn't access to the nodes. I also find Python very script-like: the module environment is limiting for anything beyond simple short scripts, and the modules make intercommunication amongst each other unnecessarily difficult.
At the moment I was creating an "idTechXed_context" object (so every workspace, whether you're creating a level, a model, a GUI or a material, would have the UI accommodate to that workspace accordingly), but integrating it with the UI classes of Blender got difficult due to the way Blender incorporates external Python classes into its UI (which is mostly done in C++). Also, the documentation of what can and cannot be done with Blender Python scripting in the UI is not enough to give me the confidence that I won't hit a showstopper later, when I already have lots of code done. Eventually, I think we would be better off changing a C++ project like DarkRadiant than battling the limitations of the implementation of Python as a scripting language in Blender.

I would like to make a material editor, but I might prefer to do a visual editor in-engine with the actual target renderer (now that Robert has improved it, it might be great): maybe have a way to place a cube or a model on which we could put a texture, so we could control its textures from the "material editor" and test effects, animation of variables, and so on.

like how you can test animations by adding a rigged model whose animations you control, but this time with a static or rigged model, or a brush cube, which can be rotated and whose textures you control, plus an orbiting light so you can see the effects of the lights from all angles

so:

  • modelling would be still be done in Blender with the modelling exporters/importers
  • mapping with Darkradiant
  • GUIs with a GUI editor (is there any GUI editor, maybe a copy of q4?)
  • materials with a visual editor
  • sound either in Darkradiant or a sound editor in-engine

sorry to disappoint guys

@damiel commented Jun 11, 2016

Even though I just started digging through the project a couple of weeks ago, I would like to thank you for your work.

It could have been a great thing if everything had worked out, and I definitely would have liked to see it in action, but maybe it's better if we focus our limited resources; after all, there aren't that many Doom people (left). DarkRadiant might not be the best-looking thing in the world right now, but I wonder what we could make out of it if we put some more resources into it, as its development is quite dormant at the moment (I think it's only greebo working on it, right?).

As far as GUI editors go, I think we only have SEED as a CEGUI editor; I think @kortemik can say more about it.

@BielBdeLuna (Member, Author) commented Jun 11, 2016

yes, but I refer here strictly to the in-game GUIs. I'm looking forward to adding back the fullscreen GUIs as in the old Doom 3 and making them work with the C++ menu code, so we could have Flash/GUI/CEGUI working and restore the old functionality.

it has been some tormenting time under the damn Python code; now my feeling is one of freedom!

@ghost commented Jun 11, 2016

No problem about the Blender attempt, thank you for looking into it. It might be better to stick to Radiant. Quetoo has an in-engine material editor I think; I think it also allows level editing, but I'm not too sure, haven't tried it. CEGUI at the moment is a pain to work with; it might be better in the future when added features come. A visual material editor would be nice, maybe one that mimics something like AwesomeBump/CrazyBump: generate normals, AO, spec maps etc. from a diffuse texture, then write out the textures and generate the proper shader script. Unrelated: I can't get alphas to work on png images.

@RobertBeckebans

I solved the problem of parsing the .def files in Blender by exporting them to JSON as a console command so they can be read easily by Blender during a map import.

The following script can import entire Doom 3 maps converted to JSON into Blender:

https://github.com/RobertBeckebans/RBDOOM-3-BFG/blob/map-primitive-polygons-for-blender/blender/scripts/addons/io_import_rbdoom_map_json.py

Imported d3dm1 map: https://www.dropbox.com/s/w7a39wfxyxj993r/Screenshot%202016-05-31%2015.33.19.png?dl=0

It works fairly well, although creating the meshes from a 20 MB map file is rather slow.
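The consuming side of such an import can be sketched without Blender. Note the JSON layout below is purely hypothetical (the real field names in the RBDOOM-3-BFG export may differ); it just illustrates how entities carrying polygon primitives would map onto Blender's `mesh.from_pydata(verts, [], faces)`:

```python
# Parse a hypothetical JSON map dump into (classname, verts, faces)
# triples, the shape Blender's mesh.from_pydata expects.
# Field names ("entities", "primitives", "verts", "polygons") are assumed.
import json

def load_map_entities(text):
    """Return a list of (classname, verts, faces) tuples."""
    data = json.loads(text)
    out = []
    for ent in data.get("entities", []):
        for prim in ent.get("primitives", []):
            verts = [tuple(v) for v in prim["verts"]]
            faces = [tuple(p) for p in prim["polygons"]]
            out.append((ent.get("classname", "worldspawn"), verts, faces))
    return out

sample = '''{"entities": [{"classname": "worldspawn",
  "primitives": [{"verts": [[0,0,0],[64,0,0],[64,64,0],[0,64,0]],
                  "polygons": [[0,1,2,3]]}]}]}'''
entities = load_map_entities(sample)  # one worldspawn quad
```

Exporting pre-computed polygons like this sidesteps the plane-intersection work entirely, which is presumably why the JSON route avoids the texture-matrix headaches discussed earlier in the thread.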

@BielBdeLuna (Member, Author)

looks great. I might just continue work on the map exporter script, seeing that I still can't use DarkRadiant, but first I will test Motorsep's implementation of the in-engine Radiant.

but mark my words, Python will be a letdown again; it's just a scripting language even if they say it's a full programming language, especially the implementation in Blender

@BielBdeLuna (Member, Author) commented Jun 12, 2016

@RobertBeckebans
what is a JSON map? Have you added an in-engine exporter to that JSON file format?

@ghost commented Jun 12, 2016

@BielBdeLuna Do you have a link to motorsep's project? I've been wondering what he has been up to.

I've seen some posts on the Blenderartists forums about Cython being used in Blender; maybe it could help speed things up, or maybe it's just a limited API in Blender. I'm guessing a Blender fork just for idTech would be too much maintenance, unless the UpBge guys would be interested.

@BielBdeLuna (Member, Author)

pretty neat then! 👍

@BielBdeLuna (Member, Author) commented Jun 15, 2016

Motorsep's ported tools are Windows-only; they rely heavily on MFC (which is an API bridge between C++ and Windows), so Linux and macOS users are left out. So Motorsep's solution for tools is no solution for us; now I'll try old versions of DarkRadiant.

@motorsep

Eh, it's a solution for people on Windows, which is the majority of developers. You can also run it under WINE on Linux until someone ports the tools to Qt (which is not going to happen in our lifetime).

DarkRadiant is a level editor. The Particle Editor it comes with is not as good as Storm Engine 2's improved Particle Editor. Not to mention the GUI Editor, DoomScript Debugger, etc.

Just saying...

@ghost commented Jun 20, 2016

I think some people have found it easier to retool MFC to wxWidgets on Linux. I might be wrong.

@motorsep

Or that. Although I've seen how the DarkRadiant port to wxWidgets has been going, and I don't think it has been a smooth ride.

Also, someone already ported the Light Editor to Qt. So there has got to be base code in place that connects the engine with Qt. It would be easier to build upon that.

@motorsep

Or you could just use Storm Engine 2 and continue improving it, since it's already more stable, more featureful and performant than OpenTech ;)

@ghost commented Jun 20, 2016

I think I downloaded SE2 the other day; haven't gotten around to trying it. I've been playing around with DarkPlaces and Xonotic lately. Work is keeping me quite busy; I still need to finish the models and animations and get exporting to work in Blender. The IQM format doesn't seem too bad though, just MD5 headaches. It would be cool if the engines could support the Q3/RTCW BSP versions; I think some modders from the Q3 scene would like that, to move projects over to the newer engines, although DarkPlaces already shows how good Q3 maps can look on Quake 1.

@motorsep

Xonotic is going to abandon DarkPlaces soon. They are moving to some Q3-based engine (Unvanquished uses it).

I think the strength of idTech 4 compared to other engines is being all real-time: no extensive map compiling needed (it takes hours to build Q3 maps with good lighting, and you can't edit anything in real-time); everything (unless C++ rebuilding is involved) can be tweaked and tuned while playing the game.

At this point there is no benefit of IQM over MD5 (talking about when someone wants to implement IQM into idTech 4). The engine doesn't support the few extra features IQM offers, and the Blender add-on is barely maintained (it used to be actively developed, with bug fixes provided as soon as they were reported; that's no longer the case). Plus, the UI of the add-on is horrible.

@motorsep

The core issue with working on all of these FOSS engines is: if no one is making any games using them, what's the purpose of developing and polishing such an engine?

At this point I think either someone needs to take Storm Engine 2 and make a project using it, or Storm Engine 2 needs to be adjusted to run Doom 3 BFG content so it could be used for modding, or @RobertBeckebans and Co. could pull the good stuff out of it and make RBDoom 3 a stable engine that runs equally well on Nvidia, AMD and Intel (SE2 does, at least on Windows) and allows for modding without any hacks/tricks.

@ghost commented Jun 20, 2016

I didn't know Xonotic was changing engines; no one on the IRC channel has mentioned it to me so far. Maybe a shift to a modded ioQuake3. OpenArena is also changing to some modded engine, with an asset overhaul. My only issue with Xonotic and DarkPlaces so far is the bots; many bots (8-16) hit my quad-core quite hard.

I know what you mean about the lightmap build times with Q3map2, especially with higher-resolution maps, but it produces some nice results. I prefer real-time though; I'd just like some realistic ambient bounce light and bleeding.

I also agree that there is no point in putting effort into many FOSS clones of the engine if no one ends up using them to make games. I think a lot of people are drawn towards Unity these days, especially since it is easier to work with for small indies on a budget; UE4 and CryEngine require some powerful systems.

The drag-and-drop nature of modern editors also pulls in more people, compared to the daunting workflow of working with Radiant. In-game level editing also seems to be becoming popular; I've noticed a trend among some indies to have some form of in-game level editor with Unity, for example, then saving it out to JSON.

@motorsep

That's one of the reasons I abandoned FOSS engines (I have a small project in works using Steel Storm as a base, but once/if it's done I don't think I ever want to touch Darkplaces again) and moved to UE4.

@ghost commented Jun 20, 2016

I'm beta testing UT4; however, my system is ancient by today's standards. I'd like to get into UE4 once I buy a new rig; just waiting for Zen and to see how it performs before deciding.

@motorsep

Mine is ancient too, but the prospect of actually getting a project finished outweighs a slightly sluggish development process :)

@BielBdeLuna (Member, Author)

I've been looking into the Storm Engine 2 code, and it has a lot of good features we could port.

As for the engine tools, we have to develop them; they are the most under-developed part of the BFG-derived engines. Besides that, the engine is quite capable for corridor shooters like Doom 3; a lot has to be improved in order to make open-world games, but this could come in the future.
A milestone of mine is to restore all the functionality the Doom 3 engine had (which at the moment to me means restoring the fullscreen GUIs and OGG playback, as well as at least having a working map editor for the engine)

@ghost commented Jun 21, 2016

I've been chatting to someone on the iodoom3 IRC who is busy learning to make assets for Dhewm3. I told him/her about OTE; they might help with assets as well. We both talked about the possibility of a coop-styled game similar to Left 4 Dead. Both of us are working on cyberpunk-themed assets, so we may work together. The assets I'm working on are currently aimed at a small single-episode game, similar to Doom 1 shareware; maybe they can be reworked for coop.

So far I have had no luck getting Storm Engine 2 to open maps; it just sits on the loading screen. The console does open, with a message saying it failed to open map "x".

I'd like to have the old console back, just for now until CEGUI is ready. I still need to figure out how to add GUI elements to entities, such as panels and the weapons. OGG is a must.

@motorsep

@yetta1 SE2 is a bare-bones engine. It won't load any maps until you add basic defs, a player, and basic scripts. Same as the Doom 3 engine if you strip it of content.

I only put in enough content so that you can run the engine and get into the menu. That's all. The rest is up to the developer to add their defs, etc.

Btw, why are you so attached to OGG?

@BielBdeLuna (Member, Author)

with Storm Engine 2 I can't help, because it segfaults when loading up.

OGG should be fine for music and long ambient tracks

@BielBdeLuna (Member, Author)

for GUIs on models you have to have one of the GUI textures in the model and then add a "gui" spawnarg with its value as the path to the GUI file, like this:

"gui" "guis/maps/crane_control.gui"

then the GUI will appear on the GUI texture in the model. For multiple GUIs I guess there are at least up to 3 GUI textures, and then maybe you use the spawnargs "gui1" "gui2" "gui3" (but I'm not sure about this).
In any case, you can have as many func_mover or func_static entities as you want (each of them with GUI surfaces) and bind them to the moving structure controlled by scripts.

@motorsep

@BielBdeLuna Did you debug to see why it crashes? Are you on Linux using a FOSS driver? (which we chose not to support, because people have a choice and can choose to use drivers that work)

The BFG engine has its own compression (similar to MP3) for audio. I don't see any benefit to having OGG: just more external libs and more potential incompatibilities and issues.

@ghost commented Jun 21, 2016

@BielBdeLuna I'll have to take a look at how the GUI elements were done in D3 for ammo depletion etc. I remember something about the texture from an old tutorial from a decade ago.

Does BFG still use the same GUI system from D3 for the weapons, or is it Flash?

Doesn't ffmpeg already give access to audio formats like Vorbis, AAC, etc.?

@motorsep

Yeah, BFG uses old GUI stuff for anything done in-game. Flash is only for full screen GUIs.

@ghost commented Jun 21, 2016

Ok that's good, not looking forward to using flash for anything.

@BielBdeLuna Off-topic: Isn't there a console command to generate lighting from the skybox for a lightmap?

@BielBdeLuna (Member, Author)

no, there's nothing related to lightmaps in the current implementation of the engine

Although Sikkpin implemented an image-based renderer for ambient light (using the skybox as environment probes), this was done with ARB shaders, and I don't remember if any C++ code was released for it

@ghost commented Jun 21, 2016

Would it be possible to add a lightmap system like the one found in ioQuake3, to bake with Q3map2? Then maybe just get lighting from the skybox mixed with real-time lights. Or how did Rage get its daylight?

@BielBdeLuna (Member, Author)

with the source at hand it's possible, but it could be difficult. In terms of shaders, a lightmap is just another term to add to the shader mix, but with lightmaps you go back to an old technology, and with it everything becomes static (like in Rage). I think it's the wrong way to go, but we could have an implementation ready in the engine in case someone wants it.

@BielBdeLuna (Member, Author) commented Jun 21, 2016

I think we should make a document with all the ideas we agree to implement

@ghost commented Jun 21, 2016

Baking lightmaps with Q3map2 can take long; however, even if the deluxel technique seems old, it is still in use in modern engines. I've noticed it with UE4 and Unity, which bake lightmaps during mapping; that can be annoying, as it slows down performance while those features are on. Static lightmaps for daylight and ambient fill can help performance.

Trying to fake daylight in D3 just looks unnatural; the ambient light textures also don't help much, as they are limited and drown out most normal-map effects, giving a flat, diffuse-texture-looking result.

@ghost commented Jun 21, 2016

Maybe a skybox probe to bake to lightmap volumes, then areas without the volume can go really dark. Otherwise a skybox+lights+distance=bounce light/radiosity.

@BielBdeLuna (Member, Author)

that's the reason to use the environment probes that Robert is/was working on. At the moment it is an unfinished implementation, but the idea is to have a static ambient light system much more realistic than the static ambient lights in vanilla Doom 3. This system should allow both a more realistic ambient light and more correct static reflections.

@ghost commented Jun 21, 2016

Looking forward to it.
