Script usage
General
Whenever you start with a newly created scene, make sure that you save the file before you export. The script extracts a name for the export directories and files from the blend file; otherwise the settings will be saved under the name of the last blend file you saved. This is only needed for NEW files.
If you start the GUI without having saved a new scene, all the
buttons will use the settings of the last saved/loaded
file.
Since the python API does not yet allow full
access to all Blender parameters, things like
nurbs/surfaces/metaballs & normal materials/textures can only
be imported using the 'Import from Blendfile' option (see
Blendfile import).
If you don't use this option, only polygonal meshes can be
exported. You can't export Nurbs, Surfaces/Curves, or Metaballs
without converting them to regular meshes first (use ALT-C). This
does not mean the script will fail if you have these in your
scene, they just won't be exported.
Don't forget to set the correct export layers if you want to hide certain objects from your scene.
It is no longer necessary to use only Targa (TGA) images for UV-textures (or other material textures when using Blendfile import); the script will convert them automatically for you. This is done by one of the Python extensions (the totga module). The converted textures will be put in a new directory called 'TGA_Textures' inside your main scene directory. This works for JPEG, Iris and PNG (supported in Blender Publisher) textures.
You no longer have to use the '_ASXX' object name extension when all you want to do is enable/disable smooth shading; this is now done automatically for you. You only need to use the 'Set Solid' or 'Set Smooth' buttons in Blender. This works the same way as in Blender: you can set some faces to smooth shading while others use solid shading. The Lightflow result will be the same.
Of course you can still use the '_ASXX' name extension to control the shading; this works exactly the same as in Blender when you have 'AutoSmooth' enabled. Whatever value the 'Degr' button is set to, use the same value for the 'AS' extension (AS = AutoSmooth). So when the 'Degr' button is set to 30, append '_AS30' to your object name. Always use two digits; for values less than 10, use a leading zero. So if you want to use the value 9, the extension would be '_AS09'.
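To illustrate the convention, here is a minimal sketch (not the script's actual code) of how such a suffix can be parsed from an object name:

```python
import re

def autosmooth_angle(name):
    # Return the AutoSmooth angle encoded in an '_ASXX' suffix,
    # or None when the object name carries no such suffix.
    match = re.search(r'_AS(\d{2})$', name)
    if match:
        return int(match.group(1))
    return None

print(autosmooth_angle('Cube_AS30'))  # 30
print(autosmooth_angle('Cube_AS09'))  # 9
print(autosmooth_angle('Cube'))       # None
```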
Note that highly angular meshes (like cubes) that have smooth shading enabled (without using AutoSmooth) will tend to show black artefacts in the Lightflow render; in these cases you will have to use AutoSmooth. Keep in mind that AutoSmooth (especially for large meshes) can take a while to calculate.
Multiple materials on a single object are now supported as well. To optimize this, the script does quite a bit of checking to make sure only the necessary information for Lightflow is exported. So this too can slow things down.
If you use instanced copies of objects (ALT-D key, linked object copy), they will only be exported once, no matter how many times they occur in the scene; the render will nevertheless be correct.
Always make sure that lights (arealights too) don't intersect any other objects; don't place lights exactly on a surface but slightly in front of it. So for an arealight patch on a ceiling, don't place the light exactly on the ceiling, but slightly below it. This is necessary, otherwise you will get shading errors.
Arealights are now one-sided, so remember to make sure that your
arealight normal is pointing the right way (to your scene).
If you are using 'Radiosity' and you don't see the expected colorbleeding effects, try enabling 'Caustics' as well. For some reason Lightflow will not always calculate indirect illumination, for example when the camera is outside of the main scene (further away than all other objects).
Exporting a single frame
Set the Export menu to
'Export this frame'. Change any other parameters as you wish, and
select 'Export'. The script will then only export the current
frame.
Exporting an animation directly
from Blender
Set the Export menu to 'Export & Render entire anim'. This is the only way to fully export deformed meshes. Use this when your animation makes use of armatures, RVK, or lattice deform. Optionally (but strongly advised, see the remarks in the main export screen section), you can set the 'Mesh files' menu to either 'Check if it exists first' or '(ANIM) Export selected only'; this last option is only available AFTER you have selected one of the animation options from the export menu.
Another optional setting only available after setting the Export
menu, is '(ANIM) save then load' in the 'Shadow & Radiosity
data' menu. You can use this when you do a fly-through animation
where only the camera is moving, Lightflow then needs to do
radiosity & shadowmap calculations (when using shadowmapped
lights, see the Light types section) only
once. This can considerably shorten render times.
Exporting an animation as a single
Python file
Set the Export menu to
'Export single file anim'. This will export a single python
script to do the rendering for you without Blender. The main
disadvantage is that you can't export animated deformed meshes.
Everything else works normally. You will have to execute this
script outside of Blender, so Blender does not have to be
running.
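Since the exported file is a stand-alone Python program, you run it from a console like any other script. A hypothetical session (the 'ANIM_' prefix is explained in the main export screen section; the scene name here is made up):

```
python ANIM_myscene.py
```

On Linux the executable may be a versioned name such as python2.2, depending on your installation.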
You can also use the '(ANIM) save then load' option from the 'Shadow & Radiosity data' menu here; however, this does not currently work with radiosity data! Lightflow gets stuck on the second frame, and might eventually crash. So you can only use this for shadowmap data here.
This is the first screen you will see when starting the script for the very first time. The script needs some information to work correctly, but you only have to do this once; everything you specify here will be saved.
When the script is started for the first time, it will guess a few values based on the system it is running on, so it might be (if you're lucky) that you don't need to specify anything other than the texture directory.
!!! WARNING !!!
Don't use spaces in any file or directory names, the script will
not continue if you used spaces in any of the path names you need
to specify here.
The first thing the script needs is the name of the directory where all exported scenes will be saved; the obvious default is 'LFexport'. This one is mandatory.
If you use Windows, then the screen will look as you see here to the right.
The script needs to know where the Python executable is located (python.exe for Windows; python or pythonX.X on Linux, where X.X is the version). If you installed Python correctly, that is, you have set the PYTHONPATH environment variable, you should not have to change anything here. But just in case it is incorrect, you will have to specify it yourself. Use the "TEST PYTHON" button to test whether the path is correct. If possible, Python will be executed and the version number should appear in the DOS-console/terminal.
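You can also check this outside the GUI: running the executable with the -V switch prints the version (the version shown here is only an example):

```
python -V
Python 2.2.1
```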
If you use Linux, there will be an extra option next: a menu where you can choose the browser you want to use for browsing the documentation.
There are four options; after making a choice, you can use the 'TEST BROWSER' button to test it, and the information/credits page should then be displayed. The available options are: "Default", the default browser, which usually means a text browser like Lynx; "Konquerer" (spelled this way in Python...), the KDE browser; "Mozilla"; and "Netscape".
With the next button (also available in Linux) you can specify the name of the text editor you want to use to edit .py files. Use the button next to it to check that you can start the editor.
The next button, only available in Windows, is used to specify where MATSpider is located. If you don't use MATSpider, you can ignore this. Since the script needs a directory name, you will have to type it in. You can test whether it is the correct directory by using the 'TEST MSPIDER' button.
And finally, the script needs you to specify the location of
your main texture directory. This one is also mandatory.
After you have done all that, and you think everything is correct, you can save everything with the 'SAVE ALL' button. Provided you specified the export and texture directories and did not use spaces anywhere, the script will continue to the main screen, which is what you will see from now on whenever you start the script.
You can still change things afterwards if you want, by using the
'PREFS' button at the top of the main export screen.
This is the
main GUI.
PREFS
button:
This button brings you back to the screen above, where you can
redefine any directory or other settings if you wish to do
so.
Shadow & radiosity data
menu:
With this menu you can make the script save or load the radiosity calculations and lamp shadow maps. This is very handy when you're doing an animation and all that is moving is the camera. Since nothing in the lighting changes in that case, it would be a waste of time if Lightflow recalculated everything when it starts a new frame. In all other cases you should leave this at the default of "Don't care". There are three standard options: "Don't care", "Save" or "Load". And when you want to export an animation, there is an extra option, "(ANIM) save then load", which makes everything automatic: the first frame is calculated fully, and subsequent frames re-use the calculated data.
!!! WARNING !!!
The "(ANIM) save then load"
option when used with "Export single file anim" currently does
not work properly with radiosity data, Lightflow will start to
render the second frame, but suddenly stops and does not respond.
So when you want to use this option with radiosity, at the moment
you have no choice but to render the animation from
Blender.
Mesh files menu:
This menu can be used to specify if you
want to make the script explicitly recalculate and export every
mesh. This is only really necessary when you use animated
deformed meshes (armature, lattice, RVK), in all other cases
(mesh only changes location,size or rotates) the mesh itself only
needs to be exported once.
There are two standard options: "Always write", which will, as it says, always recalculate and export, and "Check if it exists first", which will only export a mesh that has not been written yet. A rule of thumb: use "Always write" as long as you still have to make edits in editmode (TAB key) or are editing uv-coordinates in faceselect mode (F key); once you are done with that and are, for instance, experimenting with materials or lights, or doing edits outside editmode (like rotating, positioning, scaling), you can use "Check if it exists first".
A third option will be available when you have specified that you want to render an animation from Blender: "(ANIM) Export selected only". You can (and should) use this option whenever you have a combination of deformed meshes and normal meshes. The script will then export everything only when rendering the first frame, and after that only the meshes you have selected, which of course should be the deformed meshes.
There is a reason why you should use this. Unfortunately, Blender has a 'bug' that makes very inefficient use of memory. Full export of meshes is 'expensive': every time the script exports a mesh, some memory gets 'lost', and it is only returned to the system after you quit Blender. This means that every frame will use more and more memory and take longer and longer to export, and it might actually become impossible to fully render an animation. So whenever possible, especially with large meshes, use this option when rendering an animation from Blender.
So again, when you do use this option, before you start rendering the animation, select all objects you want the script to always export fully, which means any meshes that have RVK, Armature, or lattice deform.
Radiosity button:
Use this to toggle radiosity
calculations. (RADIOSITY?)
Caustics button:
Use this to toggle calculations for
caustics. (CAUSTICS?)
The 'What to export?'
menu:
This menu decides what is actually exported.
There are three options: "Export this frame", which exports only the current frame, which you can render from Blender.
"Export & Render entire anim", which makes it possible to render the complete animation from Blender.
"Export single file anim", which is used to create a single script which will render the animation outside of Blender.
The difference between these last two options is that you can only export deformed meshes correctly with the "Export & Render entire anim" option. The other will also export the mesh of course, but the deform is static, meaning that an animated character, for instance, will be exported in the pose of the first frame; only changes in rotation, size or position are exported. Another difference is that the exported .py file is an ACTUAL Python program, not just data for Lightflow. To make it easier to edit materials in that case, and to avoid confusion with Python statements, the material definitions are exported to a separate file called "ANIM_MATERIALS.txt". So if you want to edit the materials, you will find them there; don't try to edit the "ANIM_whatevername.py" file, unless you know what you are doing.
World Light button:
This button will put a
sphere around your scene that will be used as a pseudo
lightsource. It only works with radiosity enabled. The results
can be similar to what became known as the "Arnold look". This is
not an actual feature of Lightflow but really more of a hack that
will work with quite a few other raytracers too. It is not
necessary to have any actual lights in your scene, in fact,
rendering will be a lot faster without any lights!
When you have enabled this, a number of other buttons will be added to the GUI:
With the RGB sliders you control the colour of the surrounding light. These sliders go beyond the value of one; this is done so you can make the light stronger if you find the result too dark.
You control the size of the lightsphere with the Size button. Make it at least as large as your scene.
Below that you will find the texture button, where you can specify the name of a texture which will be applied to the lightsphere. To make it easier you can use the "FSEL" button to use Blender's file-selector (not available in Publisher). You can use this to fake HDRI lighting; this is not real HDRI but can look similar. You can use any texture of course, but the textures that will actually look like there is an environment around your scene, and so work best, are textures in spherical format as used in panorama viewers. You can get a few from my site (as well as a program to create new ones with Terragen) on the HDRIBL page.
The color buttons can still be used to alter the color and
brightness of the texture, which is often needed, since the
result tends to be too dark. To alter the brightness without
changing the color, you will have to set all three color sliders
to the same value.
Render button:
This will bring you to
the Render screen where all rendering takes place.
LAYERS button:
This will bring you to
the Export Layers screen to enable/disable the Blender layers you want to
export.
Edit .py button:
With this you can call up a
text editor with the last exported .py file, so you are directly
able to edit the python file if you need or want to do that. The
editor you specified in the paths/preferences screen will be used
for this purpose.
Imagers/DOF button:
This button brings you to the Imagers/DoF screen.
MATSpider buttons:
!!! ONLY AVAILABLE
IN WINDOWS !!!
Also only available if you
specified the MATSpider directory. With these buttons you can
view (as well as edit) all the materials available in the
MATSpider "Library" directory. For more info see MATSpider tutorial
Blendfile Import button:
This is a very recent addition that needs its own documentation: see About Blendfile import.
Documentation menu:
Not surprisingly, this
allows you to view the documentation.
There are four options to choose from.
Image Size menu:
This menu allows you to
manipulate the render image size from the GUI.
The default option is "FROM BLENDER", which will directly use the
settings from Blender. This is not available in Publisher, in
which case the default will be "4:3 -> 640 X 480" full size
(100 %).
The other options are "4:3 -> 640 X 480" for the standard PC screen aspect ratio, "16:9 -> 640 X 360" for a simple wide screen preset and "Custom", which will allow you to set the image xy-resolution to any value.
Additional control is offered with the screen size percentage menu, with which you can quickly quarter (25 %), halve (50 %) or double (200 %) the resolution, plus some additional settings.
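As a worked example (assuming, as in Blender, that the percentage applies to each dimension): "4:3 -> 640 X 480" at 50 % renders at 320 X 240, and at 200 % at 1280 X 960.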
More Parameters button:
This button brings you to the Render engine parameters screen.
Reset All button:
This button will reset
EVERYTHING (except preferences) to their default
values.
Trc depth slider:
This controls how many bounces a reflection
or refraction ray is allowed to make. For example, if you created
a scene with a reflective sphere resting on a reflective plane
and then set this value to 0, no visible reflection would render.
With a value of 1, you would see the sphere's reflection in the
plane, and vice versa, but would not see anything reflected in
the reflection itself. The larger this value, the more bounces
are calculated, and the longer it takes to
render. So when you have a lot of reflecting and/or refracting
objects in your scene, don't set this too high (the slider goes
up to 16), at a certain point the reflections/refractions won't
even contribute visibly anymore to the end result. So to keep
rendering times reasonable, set it to a value as low as you can
get away with.
Rad depth slider:
In a similar way to the "Trc depth" slider above, this value controls how deep specular (caustics) and indirect (diffuse) light "bounces" are calculated. If this value is set to zero, no global illumination or caustics are calculated, regardless of whether you switched radiosity/caustics on. So for this to work correctly, the value should be at least one.
Radiosity samples:
This button controls how
many samples are used for the global illumination calculations.
In other words, how accurate the calculations are. The higher
this value, the smoother and better the GI will look, but also
the longer the rendering times.
Photon count:
This button controls how
many photons are used to estimate the global illumination. The
higher, the more accurate the lighting calculations.
PhDiffuse:
This button controls the
clustering value of photons used to calculate diffuse (indirect)
lighting. Basically this means how many groups of photons are
used to make an estimate of the final GI solution.
PhCaustic:
This button controls the
clustering of photons used for caustics. If you get a lot of
artifacts in your image, increase this number. If you find the
result too smooth, lower this value. Sometimes less perfect
results can actually look better though.
If you find these last two buttons hard to understand, you will find a good explanation of a similar parameter used in YafRay: 2.- How to tune a photon light. The parameter is called 'radius' there, but it is basically the same as Lightflow's PhDiffuse & PhCaustic parameters.
Anti-Aliasing parameters:
Smp1 and Smp2 both have a strong effect on rendering times. Unless you know what you are doing, it is probably better to leave both alone. For fast previews it is easiest to just set Thold to 3.0, which switches anti-aliasing off.
RENDER PRESETS
MENU:
This is a quick way to set certain parameters to values for a certain situation. Options are: "User settings", which means you control everything; "Defaults", which resets all render-engine parameters to default values; "Fast Preview", which will set everything to the fastest possible options (no shadow, displacement mapping, or volumetric calculations and no anti-aliasing); and the remaining self-explanatory options: "Fast Caustics", "Fast Radiosity", "Fast Caustics & Radiosity", "Better Caustics", "Better Radiosity", "Better Caustics & Radiosity", "Precise Caustics", "Precise Radiosity", "Precise Caustics & Radiosity", and last but not least "mrmunkily's Caustics & Radiosity".
The "Fast" and "Better" presets have a reduced anti-aliasing
quality setting. The "Better" presets are already quite an
improvement on the default settings, which might be enough for
most situations. The "Precise" presets are very slow, and might
not be that much better than..."Better". Note that all of these
are just some guessed general settings, so don't expect to always
get perfect results when choosing any of the "Precise"
options.
"mrmunkily's Caustics & Radiosity" is the exact settings as
used by mrmunkily to render his well-known mech-spider.
This is not really set to render radiosity as it only uses a
single bounce, which means it really renders shadows only. To
really do radiosity & caustics you will have to set the
radiosity depth slider higher than 1.
The presets are most useful as quick initial settings from which you make further adjustments, based on what you want to do. You can still change the parameters a preset sets by simply switching the menu back to "User Settings" after you have selected any of the other options.
Exit button:
....do I really need to
explain this one???
Well, ok, just one thing, you can also use the Q key if you want,
which will do the same thing...
Export button:
This button will start the export procedure. In case anything goes wrong for some reason, there will be a message above the "Exit" button, and a more precise description of what went wrong in the DOS shell/terminal (then again, maybe not if it happened to be an internal error). In some cases it will only be a warning, for instance when no objects or lights were exported. I have tried to make these errors as self-explanatory as possible, so most of the time you won't see any typical Python errors as you might be used to from other scripts.
This is the screen where you can enable/disable layers for
export.
The number buttons enable/disable each layer individually. With the "All Off" button you clear every button, while the "All On" button enables every layer. Note that you will have to enable at least one layer before you are allowed to return to the settings screen with the "Settings" button.
Here you can add different imagers to the final render result. Some Lightflow imagers are postprocessing effects like the fake Depth-of-Field (out of focus) or glitter effect, while "Halo" and "Film" are calculated while rendering.
Depth of Field:
There are two methods
available, "fake" DoF and "real" DoF. "Fake" DoF is an actual
imager (a blurring filter based on distance), while "Real" DoF is
calculated.
To be able to use Depth of Field, you need to add an "Empty" object to your scene, change its name to "FOCUS", and move it to the point where you want everything to be in focus (sharp).
For Fake DoF:
The "Radius" parameter sets the radius of the blurring filter.
Larger values mean a more exaggerated blurring.
The "Mask" parameter sets the maximum allowed number of pixels
used for the blur filter. This also means that if you have radius
set quite high, it might be necessary to increase this value too.
Otherwise the blurring will not be as strong as you might expect.
However, it is unlikely that you would exceed the default value
of 20.
Enabling the "AutoFocus" button makes Lightflow calculate the
necessary distance values, experimenting with this, it seems to
be calculated from the closest to the furthest object from the
camera. If not enabled, the script will use the distance value of
the "FOCUS" Empty to the camera.
Use the "REAL DOF" button to enable real Depth of Field calculations.
The second row of
buttons will be replaced by a single button called "Aperture",
with this you set the aperture of the camera lens. Larger values
mean more blurring, but since the blurring is done through
jittering, the larger this value, the noisier the end result.
Like Lightflow area-lights it can be quite slow to render. This
also depends on the anti-aliasing settings, if you disabled
anti-aliasing, the result will be calculated much faster, but
also a lot more noisy. So to get the best possible quality with
this method when using a large aperture, you might need to
increase the anti-aliasing settings as well.
Halo Imager:
This imager will add halos to (bright) lightsources that are visible to the camera. Don't confuse this with the "halo" option of Blender spotlights. What is meant here is the effect you get when a camera looks directly at a bright light. It is more like a primary lensflare, a sort of glow around the light; something you would maybe create in Blender with the 'halo' material option.
The "Halo" button enables/disables this imager. With the "Reflection" parameter you specify how much of the light is reflected in the camera lens. The "Smoothness" parameter sets the smoothness factor of the camera lens; the higher this value, the larger the halos. The "Shinyness" parameter controls how strong the halos are: the higher this value, the stronger the halos. You need to set the "Shinyness" parameter to a non-zero value if you want to see any halos at all.
Glitter Imager:
This imager adds an effect similar to the halo imager's, but instead of lightsources, the effect is added to any object whose material has glittering enabled. By default the script automatically enables this in every material whenever you use this imager, so all the default script materials can produce glittering. The glitter will occur around points where the light is brightest.
"Radius" specifies the radius of the glittering, so how large it will appear. "Intensity" controls how strong the glitter is: the higher, the stronger. And the "Threshold" value sets the point above which a colour will produce glittering; this simply means that the higher this value is, the less likely it is that glitter will occur.
A use for this imager could be a simulation of the effect you see when a bright sun shines on a wavy water surface on a hot summer's day.
Film Imager:
This imager will simulate the behaviour of the optical film of real cameras. The imager uses an exposure function similar to the one used by Terragen, but you can't control the actual exposure. The only parameter you can change is the "Grain" value, which sets the amount of film grain (a noise-type effect). Unfortunately, this noise is static, meaning that the grain will move with the camera when animated, making it look more like a grainy/dirty lens. Another possible disadvantage is that on a black background you won't see any grain at all, no matter how high it is set.
The 'QuietDune' Blender sequence plugin (to be released) might be
a better choice for this effect.
The "Settings" button returns you to the main GUI.
This
GUI allows you to set various parameters that allow advanced
control over the "default" render engine. You really need to know
what you are doing to make best use of this.
Option
switches:
The "DISPLACEMENT", "SHADOWS" & "VOLUMETRICS" buttons
enable/disable the calculation for ...hmmm..... what was it
again, oh yes! displacement mapping, shadows and volumetric
rendering.
All of them are enabled by default.
Radiosity
parameters:
Here you can control the precision of global illumination
calculations.
These buttons are actually quite important to control the
radiosity precision, it is not enough to just increase the
"samples" value in the main screen.
The "Stratified Smp" button enables stratified sampling. Stratified sampling is a slower but more accurate and controlled method than Lightflow's "default" sampling method.
Lightflow calculates radiosity through a method that takes into account that indirect lighting changes very slowly throughout the environment, so instead of actually calculating it everywhere, it only calculates it every so often, and makes use of this to "guess" what happens in between. The next parameters control how often these actual calculations are done.
The "Threshold" value detemines the maximum allowed error (in
percent) for these calculations. Settings this lower than the
default of 20% already can improve radiosity calculations
considerably, the "Better" render presets have this value set to
1% for instance.
The "RUD" (reuse-distance) parameters control when exactly new
information is sampled from the environment in terms of screen
distances. If you were to set all of these values to zero, you
would get the best possible quality since then Lightflow REALLY
calculates global illumination for every pixel. Suffice it to say
that this takes considerable amounts of time. For more
information I refer you to the section in the Lightflow docs.
Lighting parameters:
These parameters allow you
to control the precision of lighting calculations. The
"Threshold" value controls shadow accuracy for arealights. Again
see the Lightflow docs for more
information.
Radiosity for _GLAS & _METL
materials:
These two buttons allow you to decide whether or not you want radiosity calculations to involve these two materials. By default only _METL affects radiosity calculations.
The reason this is not always enabled by default is that both Metal & Glass usually don't contribute much to diffuse lighting, so if it were always enabled, calculation would take much longer while the result would be almost no different.
Caching Parameters:
These parameters can
potentially speed up rendering, but since there is not much
documentation on this, it probably is difficult to make best use
of it.
Apologies to again refer you to the Lightflow docs.
Processors:
This controls the number of processors used for rendering. Obviously this is only useful for multiprocessor systems. I don't have access to one, so I can't test this, but since these systems are becoming more common, I included it anyway. It is probably also the easiest way to speed up rendering.
Volumetric Fog/Spot Halo:
The script supports simple
fog whenever you have an active world enabled in your Blender
scene that has a Mist Distance setting that is not
zero.
As the name suggests, this option is also used to make spotlights volumetric (VOLUMETRIC?) similar to Blender spotlights, or any other lights for that matter, but it has special meaning to spotlights, where the script uses a slightly more optimized method. For more on this see the Light types section.
To make fog and/or spotlight halo's really volumetric, you can use this button to enable it. However, due to the way the script sets this up, both simple and volumetric fog can only be used in "open" scenes, that is, in any scene that has nothing completely surrounding it.
Another disadvantage is that it can be EXTREMELY slow. The main reason things get slow is when the camera can see into 'infinity', that is, places where no objects are in front of the camera. A simple way of speeding this up is by placing a wall in front of the camera to make sure the camera never sees an empty background.
There are two parameters you can adjust:
"Sampling", which sets the sampling interval on a unit viewing
ray, that is, the lower this value the less Lightflow will stop
to look around how much light it still can see through the fog.
The higher the better of course, but as always, also a lot
slower.
"Density cache", which sets the amount of memory Lightflow sets
aside to keep track of shadow calculations (the most
math-intensive part of the calculation). The higher this value,
the faster the rendering. The value is in kilobytes, the default
is 2048Kb or 2MB.
Shadowmap Parameters:
When you use Spotlights or regular lamps (without the _RAY extension), the script creates special Lightflow lights which use shadowmaps like Blender's spotlights. This also means they can suffer from artefacts similar to those you might have seen while using Blender spotlights, i.e. objects that look like they are 'floating', for instance. These parameters can be used to control that. They are completely equivalent to the same Blender shadowmap parameters, with some slight differences:
"BufSize" sets the shadowmap size, this can be set
to any value, although restricted by the script from 64 to 2048
pixels maximum.
"Samples", sets the number of samples used to sample the
shadowmap.
"Bias" controls the offset used to remove shadowmap artifacts, in
Lightflow this hardly ever needs to be changed, only change this
when absolutely necessary.
Finally, the "Settings" button will return you to the settings screen.
Some notes about the picture you see in the render screen: it is an early tryout to render the Sibenik Cathedral created by Marco Dabrovic for the Radiosity Competition.
This screen is where all rendering inside Blender is done.
You can start Lightflow from here and view the results as well.
It is best to use this as a kind of preview system. The larger
square below "Current Render" is the display window. The picture
will be scaled to fit inside the window. The method used might
make the pictures look worse than they really are. Small pictures
will look pixelized (blocky). While large pictures might look
like the anti-aliasing was not switched on. This is just the
effect of the display method, it is not how the picture really
looks.
When you render an animation, the current frame-number will also
be displayed.
RENDER NOW button:
This button starts Lightflow as a background task, which means that Blender keeps working after you have used it. If you don't use the 'AutoDisplay' option, you can start the render here and then continue editing your scene, checking back to this screen once in a while to see how far Lightflow has progressed. In case Lightflow crashes (which WILL happen from time to time), Blender is not affected and will continue to work normally.
When you are not happy with the results you see while Lightflow is rendering and don't want to wait until it is done, you can stop Lightflow by switching to the DOS-console/terminal and pressing CTRL-C. Sometimes this might stop the script as well, but most of the time the script should not be affected by it.
DISPLAY button:
You probably will not use this button much, since it is much easier to use the R-key (for Reload), which loads the targa file Lightflow is currently working on. Since these are often partial results, large parts might still be black. Also, with radiosity the first renders will look very blocky; these are the pre-calculations Lightflow is working on, not the actual end result.
Whenever you press the R-key or use the "...Display..." button, the script will also check if Lightflow is still busy; it might also report that Lightflow has crashed, or didn't start at all because of a syntax error in the .py file, when you edited it for instance. Just look at the DOS shell/terminal to see the exact cause of the error.
AutoDisplay
(This feature might not always be available on
Linux)
For quick renders you can use the 'AutoDisplay' button. This will display the picture automatically, without the need to press a key every time you want to see it. However, Blender is locked during the render: it can't react to any key presses, mouse events or any other important event, and if you switch to another window during rendering, you won't be able to return to the full Blender window until it is done. So it is best to only use this for quick (preview) renders.
You can stop the autodisplay if you want, by switching to the DOS-shell or terminal and pressing any key if you use Windows, or the Enter/Return key if you use Linux. After stopping it, the script will work the same as if you had used the render button without 'AutoDisplay'.
You can set the time to wait before the picture is displayed again with the 'Display Interval' button. This time is in seconds; you can't set it lower than one second or higher than a minute (60 seconds). The best value also depends on your graphics card/driver and the image size. If you have an up-to-date OpenGL driver, the method used to display the picture will be the fastest (it uses OpenGL directly). If the particular function is not supported by your driver, large images (>320 X 240) may take some time to load, and setting the interval too low might keep the script continually busy loading a picture without ever pausing. Look at the output of the script when you first start it: if it says 'Fast display, array', you know everything is optimal. If however it says 'Slow display, no array', which might happen especially on Linux, it might be better not to use the 'AutoDisplay' feature at all. It might also say 'Fast display, no array' or 'Slow display, array'; then it might sort of work, try for yourself...
After rendering an animation, you can sort of
'play' the animation with the left/right cursor (arrow) buttons.
Holding the right arrow key plays the animation forward, and the
left arrow key plays it backwards. If you want to step through
the animation frame by frame, then use the 4 and 6 keys on the
numerical keypad.
Of course 'playing' the animation also depends on your OpenGL
driver again, if it can't display the pictures fast enough, it
probably looks more like a slideshow...
As usual the "Settings" button brings you back to the main GUI.
Besides Blendfile material/texture import, the script now also supports direct import of MATSpider materials, which is probably the best way to construct new Lightflow materials. However, Linux users don't have access to MATSpider, not every Windows user can or wants to use MATSpider, you can't create glass or metal with the standard Blender materials (unless you use environment mapping with an image, which is also supported in Lightflow), and of course it is not always necessary either. For this purpose the script has a number of preset materials. You give a material special meaning to the script by appending a four-letter extension to the material name. For instance, when the material name is "Red" and you want to make it a Glass type material, you would change the name to "Red_GLAS": the material name followed by an underscore plus the material extension.
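A minimal sketch of this naming rule (illustrative only, not the script's actual code; the set of extensions is the one documented below):

```python
# Four-letter material-name extensions documented below.
EXTENSIONS = ['DISP', 'DISO', 'GLAS', 'DIGL', 'DIOG',
              'METL', 'DIME', 'DIOM', 'AMBI', 'SPID']

def split_material_name(name):
    # Split 'Red_GLAS' into ('Red', 'GLAS');
    # return (name, None) when no known extension is present.
    if len(name) > 5 and name[-5] == '_' and name[-4:] in EXTENSIONS:
        return name[:-5], name[-4:]
    return name, None

print(split_material_name('Red_GLAS'))       # ('Red', 'GLAS')
print(split_material_name('checker1_SPID'))  # ('checker1', 'SPID')
print(split_material_name('Plain'))          # ('Plain', None)
```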
Due to the lack of full access to Blender
materials with the current python API, not all material
parameters and no material textures are exported.
(UPDATE: Unless you use Blendfile import of
course...)
Also, Lightflow DOES NOT do displacement mapping on polygonal (mesh) objects; it actually does just regular bump mapping, so wherever you see displacement mapping mentioned below, read it as 'bump mapping'...
(UPDATE: Again, Unless you use Blendfile import with nurbs
surfaces, which DO support real displacement
mapping)
These are all the possible name extensions you can use with some simple standard "sphere on a plane" example pictures. The checkered floor is a MATSpider material:
No extension:
This is a standard material that is used whenever the
materialname of the object has no extension. The parameters that are used for the Lightflow material
are:
(Example image: default Blender material settings with the color set to …)
DISP:
This material is exactly the same as the 'no extension' material, but when you use a uv-texture, it will be used for both color and DISPlacement mapping. So all parameters that are used for the 'no extension' material are also valid for this one. IN CASE YOU DID NOT READ THE ABOVE WARNING: LIGHTFLOW DOES NOT SUPPORT DISPLACEMENT MAPPING ON POLYGONS, IT WILL DO NORMAL BUMP MAPPING INSTEAD! THE ONLY WAY TO DO REAL DISPLACEMENT MAPPING IS BY USING NURBS/SURFACES WITH THE "BLENDFILE IMPORT" OPTION.
![]() "_DISP" material with a sphere-mapped
uv-texture, |
DISO:
Same as "DISP", the uv-texture this time is used for DISplacement mapping Only. |
![]() "DISO" material, the same settings as for "DISP",
as |
GLAS:
This is a basic Glass type material. Unfortunately, for obvious reasons (Blender has no refraction material parameters), some of the parameters won't make much sense as far as the Blender render result is concerned, but hopefully you won't find it too hard to use. The Blender material parameters which have an effect on the Lightflow parameters are:
If a uv-texture is used, the mirror color, specular color, transmission color and specular transmission color are modulated (multiplied) by it. If you don't have caustics enabled, you will see fake caustics instead; however, this only works properly with "RAY" lightsources or a "Sun" light (see Light types). Just as an example, to make a simple clear glass, set all color sliders for …
![]() "GLAS" material, here a slightly blue tinted
standard |
DIGL:
Exactly the same parameters as the "GLAS" material; a uv-texture is used as a DIsplacement map on a GLass material.
![]() "DIGL" material using a sphere-mapped
uv-texture, |
DIOG:
Same as "DIGL", but the uv-texture is used for DIsplacement mapping Only on a Glass material. |
![]() "DIOG" material, the uv-texture only affects displacement. |
METL:
This is a basic Metal type material; again, like the "GLAS" material, some of the Blender parameters don't have the same meaning as they would have in Blender. The following Blender parameters affect the Lightflow material:
If the object has a uv-texture, the material diffuse color, specular color, and mirror color are modulated (multiplied) by the texture color. As an example, to make a mirror type material, set rgb to black (this is the object's own color), set the Spec and Mir colors to white, and set Ref near or at the maximum for an IOR of 26.
![]() "METL" material with color set to a dark blue and a maximum IOR. |
DIME:
Uses the same parameters as "METL"; the uv-texture is used as a DIsplacement map on a MEtal material.
![]() "DIME" material, which uses a sphere-mapped |
DIOM:
Again, the same parameters as "METL"; the uv-texture is used for DIsplacement mapping Only on a Metal material.
![]() "DIOM" material, exactly the same as the "DIME" |
AMBI:
A shadeless material will be created, which can be used to emulate lighting when used with radiosity; it also accepts uv-textures. Only the RGB sliders to set the color are used. This is the same material as used by a WORLD LIGHT, where it functions as an AMBIent lightsource.
![]() "AMBI" material, here radiosity is activated so you |
There is one other extension which was used
in all of the examples, "_SPID" which is used to replace the
Blender material with a MATSpider material.
For instance the MATSpider material used for the floor was called
"checker1", so in Blender the name of the material was
"checker1_SPID".
By default the material itself is not used at all, only the name
is used. However, there are ways to make use of the actual
Blender material settings, I refer you to the MATSpider tutorial for
more information on this topic.
Also remember that it is now possible to use multiple materials on a single mesh, they will be exported correctly. Multiple UV-textures are not supported, only the first image found is used (it can be done by using Blendfile import again).
Light types
Almost all Blender light types are now supported, with the exception of an exact match for 'Hemi', which is more of a hack that doesn't really look like Blender Hemis. Quad mode lights are not supported; there were too many difficulties in matching the light falloff and intensity. This is also true for the other light types, but generally you will only notice it when lights are very distant: the Lightflow result will always tend to be darker than the Blender image. The opposite also holds: when lights are very close to objects, they might often look too bright. If you do find the result too dark/bright, it is probably best to try changing the Dist value first; this changes the intensity faster than the Energy value.
The only light which does match the one from Blender exactly (at
least, it should), is the "Sun" light type.
There are several Blender light mode options that will have an influence on the Lightflow lights as well.
Here is a list of all possible light types and the adjustable parameters:
Lamp
This is the normal Blender 'Lamp' type; there are two possible Lightflow types for it. The first is a Lightflow 'soft' light which, like Blender, uses shadowmaps to approximate soft shadows. You can change this light to a raytraced point lightsource with sharp shadows by appending '_RAY' to the lamp or lamp object name. So if the lamp name is 'LAMP1', you can change it to a pointlight by changing its name to 'LAMP1_RAY'.
(Example image, from left to right: Blender, Lightflow 'soft' light, and Lightflow with the lamp having the '_RAY' extension (a pointlight).)
Spot
This light simulates a spotlight. When not using the '_RAY' name extension, the Lightflow light ('soft-conic'), like Blender's, uses shadowmaps to approximate soft shadows. When you append '_RAY' to the lamp or lamp object name, as for regular Lamps (see above), it will create a spotlight which produces sharp shadows. All the same parameters as for 'Lamp' are adjustable: Color, Energy & distance, and when using the shadowmapped light, also the Quad1 slider to change shadow softness; in addition you can use the 'SpoBl' slider to change the edge sharpness, exactly like for the Blender spotlight. When you have 'Halo' enabled, the script will do some magic to try to produce similar results in Lightflow.
!!! WARNING !!! Be very careful with the 'Sampling' parameter in the render engine parameters screen; for quick previews set this to 1.0 or even less. This can take a very long time to calculate, although it is faster than the spot & volumetric fog combination, which can take even longer, especially when the camera 'looks into nothingness'. If you do want to speed that up, use the already mentioned 'wall in front of the camera' method. And last, for the 'fake' halo method, it is best to adjust the 'Dist' value until the base of the spot lightcone you can see in Blender's 3D view is just below a floor or behind a wall or anything else, otherwise you will see some artefacts. Don't set it too high either; the fake halo might get too weak.
(Example image, left to right: Blender, Lightflow shadowmapped light & Lightflow spotlight with the '_RAY' name extension.)
Sun
This will simulate an infinitely distant lightsource like the sun. The only parameters used are color and energy; distance is not used. Warning!! Currently sun lights don't work when you have 'Shadows' enabled and use fog in your scene.
(Example image, left to right: Blender sun, Lightflow sun ('directional').)
Hemi
An exact match for this light type is not available in Lightflow; instead I tried to approximate it a bit by actually using two Lightflow lights at the same time. Like sun lights, only two parameters are adjustable: Color & Energy. When you enable 'Square' mode, the script will do something completely different: it will create a real Lightflow arealight instead ('patch'). The parameters to change are: Color, Energy and distance. To control the size of the arealight, you can use the 'SpoSi' slider; its value divided by ten is used to scale the arealight. Be very careful with this! The larger the arealight, the EXPONENTIALLY longer the rendering times. The shadow accuracy may be controlled with the "lighting threshold" parameter in the Render engine section. Anti-aliasing will also add to the render time.
(Example image, left to right: Blender Hemi, Lightflow Hemi simulation, Lightflow 'Hemi+Square' which produces an automatic real arealight.)
For the Lightflow 'Lamp' and 'Spot' types without '_RAY', remember that since these lights use shadowmaps, they can produce the same undesirable behaviour as they sometimes do in Blender: objects that appear to float when lights are very far away, for instance, or shadows that appear blocky (improbable, but it could happen). You can try to control these by changing the 'Shadowmap parameters' in the render engine section. These are exactly equivalent to the parameters with the same name in Blender.
Things To Remember
Lightflow WILL crash from time to time, I can't do
much about that. Most crashes for me happen at the end of the
render, which means that the render itself is intact. There are
probably some cleanup issues in Lightflow, the crashes seem to
occur often when using a lot of textures and radiosity/caustics.
Lightflow will also create a lot of temporary files that it does not always delete, especially when it crashes. These files can be HUGE (several megabytes); they only take up space and do nothing, so you will have to delete them manually from time to time. They will all be in whatever directory you have assigned to the LIGHTFLOWTEMP environment variable. For Windows this could be the Windows/Temp directory, but it is better to create a special directory for this purpose, as described in LFPROBLEMS.txt, since you can then simply select all files at once and delete them. The same is true for Linux: check whatever directory you assigned to the LIGHTFLOWTEMP environment variable.
The files look like tmp_x_xxxxx, so names like: tmp_2_a12345 or
tmp_pc_b45836 for instance.
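If you prefer, a few lines of Python can do this cleanup; a minimal sketch (assumes LIGHTFLOWTEMP is set and that every tmp_* file in it may really be deleted):

```python
import glob
import os

# Remove Lightflow's leftover temporary files (names like tmp_2_a12345).
tmpdir = os.environ['LIGHTFLOWTEMP']
for path in glob.glob(os.path.join(tmpdir, 'tmp_*')):
    print('removing ' + path)
    os.remove(path)
```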
If you use the MATSpider interface module (MSPint.pyd): Don't try running the game-engine while you work with the script, the MSPint module will fail to load, causing a "_PyImport_FixupExtension" error. I still don't know what the cause of that is.
When you have edited mesh vertices, faces or uv-coordinates, make sure the mesh files menu is set to 'Always write', otherwise the edits won't be properly exported. But as soon as you're done with edits in editmode, switch to 'Check if it exists first', especially when you work with large meshes; it is very easy to use up memory. Again, not my fault; I tried to minimize it as much as possible, but I can't undo the Python API bugs...
When you don't want smooth shading, using '_AS00' is not necessary anymore, you only need to click 'Set Solid' in Blender. In fact, like in Blender, it is possible to set some faces to smooth shading while others have solid shading enabled. The exported mesh will look (more or less) exactly the same in Lightflow.
These explanations are not necessarily accurate, I'm no expert by any means...
Glossary
Radiosity
Is an energy-distribution calculation method; it actually originates from physics, where it is used to calculate how heat spreads throughout an environment (I think...). The same method was applied to lighting (also a form of energy) and the name has stuck ever since; the first results are from as early as the 1940s(!), where the results were put together by hand! (Really true, see for yourself: Radiosity History). So it really has nothing to do with Lightflow or raytracing per se. A name which much better describes the method(s) used by Lightflow is 'Global Illumination': lighting calculations that incorporate light coming from (and going to) everywhere, wherever it originates. So direct light, indirect light bouncing off walls, light reflected by mirrors, light refracted by glass, and so on...
Caustics
Is the term used for the patterns of light you see when light is reflected off a metallic object and/or refracted by a transparent object/material.
Volumetric
rendering
Is a method to calculate how much light is absorbed/scattered by particles as it travels through space. Examples of this are fog, clouds, smoke, dust and fire; it is sometimes used for hair/fur/grass too, and also in medical imaging.
For more Raytracing buzzwords and their explanation, see this site: FuzzyPhoton
Useful links
Almost all Lightflow-related links are Japanese; they really like Lightflow in Japan. But even if you don't understand Japanese, these can be very useful: