Blendfile Import

General information
Since there is currently no Python support for the information needed to fully export things like materials and textures, the script reads all the needed information from a Blend file. The disadvantage is that you have to save your file before you export, every time you change anything that can only be accessed in this manner: materials, textures, surfaces and metaballs. This does not mean that it won't work when you don't save, only that the information used is from the last time you saved the file. So in case you make a change to something and the result doesn't seem to change, make sure that you save first. It also does not work for single-file animation export. It does work when rendering the animation directly from Blender, but not for individual movement of metaballs or deformed surfaces. Animated materials using IPOs can be used.

Materials/Textures
First, every single texturing step is converted to a Lightflow pattern, which can make the output file quite large! You will probably find these parts very hard to edit by hand, although I tried to name things in such a way that at least the relevant colormapping blocks are recognizable: LFCOL for 'Col', LFREF for 'Ref', LFSPEC for 'Spec', LFNOR for 'Nor', etc.
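The naming scheme above is regular enough to sketch in a few lines. This is a hypothetical helper (not the script's actual code) showing how a Blender map-to channel name could be turned into the exported block name:

```python
# Hypothetical helper illustrating the naming scheme described above:
# each Blender mapping channel gets an "LF" prefix in the exported file.
def lf_block_name(channel):
    """Return the Lightflow pattern block name for a Blender map-to channel."""
    return "LF" + channel.upper()

print(lf_block_name("Col"))   # LFCOL
```

So when searching the exported file for the parts that drive, say, specularity, look for blocks named LFSPEC.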

Not all texture mapping types can be converted to Lightflow equivalents; only UV, Object, Glob, Orco and Refl are supported. Nor mapping simply has no Lightflow equivalent ('Nor' as in normal mapping, not Blender's 'Nor' bumpmapping of course, which is exported as displacement mapping).

There don't seem to be texture-space coordinates specified in the blendfile; this is probably handled internally. This means that the script has to calculate texture space itself, which might not always produce the same result as Blender. The best way to handle this is to always have 'AutoTexSpace' enabled in the edit button section of Blender, which seems to do almost exactly what the script does. This also means that you can't use texture-space placement with the T-key, since you cannot move the Lightflow texture with it. The easiest method of texture placement is the same one you would use in Blender: an Empty (or in fact any other object) with 'Object' mapping. You can then easily scale, rotate and move the texture.

Unfortunately, Lightflow seems to have another bug when using 'cylinder' mapping (Blender's 'Tube'): it doesn't seem to work at all. For this reason, any texture in Blender that uses 'Tube' mapping will actually be exported with 'sphere' mapping. 'Cube' mapping is also not available in Lightflow; a possibility would be to emulate it with uv-coordinates. So basically, only 'Flat' and 'Sphere' work as normal.
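The substitutions above amount to a small lookup table. This is only a sketch with made-up names (the script's real projection names and structure may differ), but it captures the rules just described:

```python
# Sketch of the mapping-type substitution described above (names hypothetical).
# 'Tube' falls back to sphere mapping because Lightflow's cylinder projection
# appears broken; 'Cube' has no Lightflow equivalent at all.
BLENDER_TO_LF_PROJECTION = {
    "Flat":   "plane",
    "Sphere": "sphere",
    "Tube":   "sphere",   # cylinder mapping buggy in Lightflow -> use sphere
    "Cube":   None,       # not available; would need uv-coordinate emulation
}

def lf_projection(blender_mapping):
    proj = BLENDER_TO_LF_PROJECTION.get(blender_mapping)
    if proj is None:
        raise ValueError("no Lightflow projection for %r" % blender_mapping)
    return proj
```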

The texture repeat parameters also cannot be emulated with any Lightflow parameter, so don't expect to see the same result in Lightflow when setting the xrep/yrep buttons to any value other than one. It works better for textures in 'Repeat' mode; 'Clip' has problems when the size parameters are not set to 1.0 and repeat is used at the same time.
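For reference, this is roughly what the 'Repeat' and 'Clip' extension modes mean for a single texture coordinate outside the 0..1 range. This is an illustrative sketch, not the script's or Blender's actual code:

```python
# Rough sketch (not the script's actual code) of how 'Repeat' and 'Clip'
# extension modes treat a texture coordinate outside the 0..1 range.
def extend_coord(u, mode):
    if mode == "Repeat":          # wrap around: 1.3 samples at 0.3
        return u % 1.0
    if mode == "Clip":            # outside the image: signal "no texture here"
        return u if 0.0 <= u <= 1.0 else None
    raise ValueError(mode)
```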

For the mapping axis buttons (the xyz buttons to the left of the size buttons), setting two or more of them to the same axis doesn't work.

Texture Size (sizeX, sizeY, sizeZ) and Offset (ofsX, ofsY, ofsZ) should work reasonably well most of the time.

From the Image mapping buttons in the texture button section, only 'Rot90' is used.

Environment mapping is ignored, although Lightflow can actually do environment mapping. This has nothing to do with the 'Refl' texture mapping setting, which you can use as normal with regular images, just not with Blender environment mapping. (It really would be superfluous in Lightflow: it would have to render six images for an environment map, where it is much better and more realistic to just use raytraced reflection/refraction.)

Other ignored parameters are: the texture minxy/maxxy, StartFr/Len, Offset/Frames/Fra/Fie/Ima/Cyclic, and the Filter buttons.

Extend has the same effect as Repeat, and ClipCube is the same as Clip.

From the procedural textures, the following are supported:
- Clouds: both 'Default' and 'Color', as well as the 'Soft' and 'Hard' noise types.
- Wood: always rendered as rings; 'BandNoise' or 'RingNoise' will enable turbulence like in Blender.
- Marble: Soft/Sharp/Sharper are not used.
- Blend: supported, but might not be quite right yet.
- Stucci: basically the same as Clouds, but used as a displacement map instead (note that the others can of course also be used as displacement maps).
For all of these except Stucci, ColorBands are supported as well.
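Since ColorBands are supported, it may help to know what the exporter has to reproduce: a ColorBand is a set of (position, color) stops that get interpolated. This is a generic sketch of that idea with made-up values, not the script's actual implementation:

```python
# Generic sketch of ColorBand evaluation: linear interpolation between
# (position, color) stops, the kind of thing an exporter would bake into
# a Lightflow color pattern.  All names and values here are hypothetical.
def eval_colorband(stops, t):
    """stops: sorted list of (pos, (r, g, b)); t: a value in 0..1."""
    if t <= stops[0][0]:
        return stops[0][1]
    if t >= stops[-1][0]:
        return stops[-1][1]
    for (p0, c0), (p1, c1) in zip(stops, stops[1:]):
        if p0 <= t <= p1:
            f = (t - p0) / (p1 - p0)
            return tuple(a + f * (b - a) for a, b in zip(c0, c1))

# A simple black-to-white band as an example.
band = [(0.0, (0.0, 0.0, 0.0)), (1.0, (1.0, 1.0, 1.0))]
```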

Don't expect to see the exact same result in Lightflow when using procedural textures, only a vague similarity, or maybe even something completely different.

The texture brightness/contrast/color sliders are currently only implemented for image textures.

Specular 'Hard' mapping is not quite right; there seems to be another bug in Lightflow that prevents using any pattern other than the texture itself.

'Alpha' mapping is basically faked, there is no support for Alpha mapping in Lightflow. For this reason, the Lightflow result will also not be the same.

'Stencil' mode only works for Color mapping at the moment.


Blender material render
Lightflow material render

These two examples show an overview of some materials as they would look in Lightflow when using Blendfile import.
The top sphere has an image mapped with 'Refl' enabled, so it looks like it is environment mapped. The sphere below that has two texture layers: one for the panel image, which is sphere-mapped, and a noisy texture on top with 'No RGB' enabled, mixed in proportion with the layer below for a rusty effect. The left-most cube has a uv-mapped texture to compensate for the lack of cube mapping in Lightflow. This does not mean the uv-editor was used; I just clicked 'make TexFace' in the edit button section of Blender, and then enabled 'UV' for the texture map (the button left of 'Object').
The see-through box next to it uses the same uv-mapping, the texture is used as an alpha-map. I put a '_RAY' light inside the box so you can see that the lightrays and shadows outside the box are filtered by the transparent faces.
The remaining boxes all use a procedural material, the red and white box has a marble texture, the box below that a wood texture, the red-green-blue box uses a colorcloud texture, and finally the right-most box has a stucci-texture for bumpmapping.
As you can see the procedural textures can look quite different.

Metaballs
In Blender, metaballs behave like one object: when you create metaballs out of edit mode, they will still influence other metaballs not created in that group. The material of the metaball group that gets a material first will also be shared by all other metaball groups. The script exports these groups separately, also using a single material as in Blender. There is, however, no influence among groups of metaballs, so if you want all metaballs to behave as one, create them in edit mode only. For animations, only the group as a whole is animated, not individual metaball elements, so at the moment they are probably not very useful. Metaballs can also look quite different (in shape) in Lightflow.
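The material-sharing rule can be made concrete with a small sketch. Assuming the "gets a material first" order corresponds to the order the groups appear in (an assumption; group and material names here are invented), the rule is just "first group with a material wins":

```python
# Sketch of the material-sharing rule described above: every metaball
# group is exported with the material of the first group that has one.
# Creation order is approximated by list order; all names are made up.
def shared_material(groups):
    """groups: list of (name, material-or-None), in creation order."""
    for name, mat in groups:
        if mat is not None:
            return mat
    return None

groups = [("MBall", None), ("MBall.001", "RedShiny"), ("MBall.002", "Blue")]
```

In this example every exported group would use "RedShiny", even the one that was assigned "Blue".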

Blender metaballs
Lightflow metaballs

The left side is the Blender render, and the right side is the Lightflow render. As you can see, the Lightflow metaballs are shaped slightly differently, as well as being a bit smaller.

Curves/Nurbs/Surfaces
Currently, weight information is not used in the export. It is best to set the weights of all points to the same value in Blender if you want to see the same shape in the Lightflow render. For example, a 'SurfaceDonut' (torus) will not look the same in Lightflow as in Blender, since Blender uses different weights for different points. To see the same shape as in Lightflow, select all points and set all weights to a single value in the edit button window (edit mode, select all (a-key), click any of the value buttons like '1.0', 'sqrt(0.5)', etc., or type a value in the 'Weight:' button, then click 'Set Weight').
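Why does setting all weights to the same value work, no matter which value you pick? In a rational (weighted) spline, a constant weight cancels out of the numerator and denominator, leaving the plain unweighted curve, which is all the exporter can reproduce. A tiny illustration (generic math, not the script's code):

```python
# Tiny illustration of why equal weights matter: a rational (weighted)
# combination with a constant weight reduces to the plain combination,
# so setting every control point to the same weight removes exactly the
# rational influence that the exporter cannot reproduce.
def weighted_point(points, weights, basis):
    """Rational combination: sum(w_i * b_i * p_i) / sum(w_i * b_i)."""
    num = sum(w * b * p for p, w, b in zip(points, weights, basis))
    den = sum(w * b for w, b in zip(weights, basis))
    return num / den

pts   = [0.0, 2.0, 4.0]
basis = [0.25, 0.5, 0.25]   # some blending-function values at one parameter
```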

In Blender you can also change the shape of the surface with the Uniform/EndPoint/Bezier UV buttons; most of the time 'Uniform' will work best when exporting to Lightflow. You can use Endpoint and Bezier, but these are not really correct in the Lightflow render: the export script just uses some functions that probably have nothing to do with what Blender does, but might nevertheless be useful to quickly alter the general shape. Toggling 'Cyclic U/V' in edit mode will also work for the Lightflow shape.
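For background, the usual meaning of Uniform versus Endpoint is in the knot vector: a uniform knot vector spaces all knots evenly, while an endpoint (clamped) one repeats the first and last knots so the curve actually touches the end control points. This is a rough sketch of that idea; it mirrors neither Blender's nor the script's exact code:

```python
# Rough sketch of the difference between 'Uniform' and 'Endpoint' knot
# vectors for a spline with n control points of the given order
# (order = degree + 1; knot count = n + order, the usual convention).
def knot_vector(n, order, endpoint=False):
    count = n + order
    if not endpoint:
        return list(range(count))      # uniform: 0, 1, 2, ...
    # clamped/endpoint: repeat the first and last knot 'order' times
    inner = count - 2 * order
    return [0] * order + list(range(1, inner + 1)) + [inner + 1] * order
```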

Rendered in Blender: one light, purposely set a bit dark, since with radiosity the darkness will be filled in with light bounced from the walls.
Rendered in Lightflow: with radiosity, it looks brighter because of the light bounced from the reddish surrounding walls. The mouse seems to be more 'awake' too...
Displacement mapped surface: I just used a MATSpider material on an exported Blender surface with a depth setting of 1.0. Displacement mapped surfaces can take long to render, since the surfaces are divided into lots of tiny triangles.
The best candidates for successful export are skinned surface curves, like this toonmouse head created by svo & Kib_Tph. (see the tutorial from the old Blender.nl site, now on IngieBee's site)

WARNING: Since the export is not really 100% and Lightflow also has its bugs, Lightflow will sometimes crash when trying to render a surface. Sometimes Lightflow simply seems to halt or get into an infinite loop, and you will have to stop it with CTRL-C. If you have used the Uniform/Endpoint/Bezier buttons and the object becomes invisible in the 3d-view, don't try to render it with Lightflow anyway, since that seems to be one of the occasions when the infinite loop occurs.