Hi.
I like your application concept! Win XP still fails, but I have Win 7 x64 here.
I’d like to know:
1. Is OpenGL shading planned, or is a wireframe mode available for adjusting the scene? Also, I can’t figure out how to assign the sun direction (vector) for an interior test.
I have an average quad-core machine with an average GPU, but it lags all the time while tweaking the scene, and the sampling flicker is annoying. I’d like to work with the mesh (ortho + perspective, normal alignment) simply and cleanly, then proceed with the render afterwards.
2. Does it render on the GPU?
3. Does it support light layers and image progress saving (to continue any time after stopping)?
4. Does it align normals automatically?
5. Does it support microfacet BRDFs?
…
Please point me to a link if some of this is already explained.
Thanks!
Hi,
It’s great that you like the FluidRay RT concept!
1. If you look at the top of the render view, there is an icon that lets you change the render mode. Change it to “quickshade” and you’ll have something very similar to an OpenGL preview.
1a. To change the sun direction, click on the sun/sky environment (did you assign it to the scene?) and change the sun direction in the attribute editor.
1b. An orthographic camera is coming soon.
2. FluidRay RT is CPU based, so you don’t have any GPU memory limitations.
3. Light layers and progress saving are coming soon.
4. Can you please explain what “align normals” means?
5. Yes, microfacet BRDFs are used in the plastic, metal and metallic paint materials.
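(For readers curious what a microfacet distribution actually computes, here is a minimal sketch of the GGX/Trowbridge-Reitz normal distribution function, a common choice in microfacet BRDFs. This is a generic illustration, not FluidRay RT’s actual implementation.)

```python
import math

def ggx_ndf(cos_theta_h, alpha):
    """Trowbridge-Reitz (GGX) normal distribution function.

    cos_theta_h: cosine of the angle between the surface normal
                 and the half vector
    alpha:       roughness parameter (small = mirror-like, 1 = very rough)
    """
    a2 = alpha * alpha
    denom = cos_theta_h * cos_theta_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom * denom)

# A smooth surface concentrates microfacet normals around the
# geometric normal, so the peak at cos_theta_h = 1 is much higher:
print(ggx_ndf(1.0, 0.1))  # sharp peak (glossy)
print(ggx_ndf(1.0, 0.9))  # flat distribution (rough)
```

At alpha = 1 the distribution becomes uniform (1/π everywhere), which is why very rough materials lose their highlight.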
Did you have the chance to take a look at the tutorial page?
Thanks!
1. Yes, I found that some alternative shading modes are available, though even simple shading lags the CPU and flickers. Imagine you have a heavy scene and want to turn it by 10 degrees: it starts re-rendering again, and so on every time… how helpful a simple wireframe mode would be, without any rendering! Plus, in wireframe mode the sun could have a visible vector to drag and point wherever you need, with visual feedback (for example, to aim it in a small window).
2. I know a hybrid (CPU+GPU) approach exists, where the GPU is used for intersection acceleration and doesn’t limit the RAM used.
4. When you import a mesh, it can come with “inconsistent shading normals”. Usually this is fixed in the modelling application, but smart handling in the rendering app would be a great help.
5. The tutorial? Sounds like a good idea!
I’ll try it more thoroughly, probably today, and give some feedback again.
There is in fact some lag in the viewport. We are working on the issue; it will be fixed soon.
Since the hybrid approaches use the GPU to store the acceleration structure, and that is what uses most of the RAM, they still have memory limitation problems. Those hybrid solutions get around the problem by switching to full CPU mode when the scene doesn’t fit in GPU RAM.
Here we chose to use Intel’s Embree ray-tracing kernels, since they provide very good performance in most scenarios and you don’t have to deal with all the complexities of a hybrid approach.
Good point about inconsistent shading normals; we’ll add it to the to-do list.
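(As an aside, a simple heuristic for fixing inconsistent shading normals is to flip any vertex normal that points away from its triangle’s geometric normal. The sketch below is a generic illustration with hypothetical names, not FluidRay RT’s importer.)

```python
def align_shading_normals(face_normal, vertex_normals):
    """Flip any vertex (shading) normal that disagrees with the
    triangle's geometric normal, a simple consistency heuristic.
    Normals are 3-tuples; this is a hypothetical helper, not a
    real FluidRay API."""
    def dot(a, b):
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

    fixed = []
    for n in vertex_normals:
        if dot(n, face_normal) < 0.0:    # inconsistent orientation
            n = (-n[0], -n[1], -n[2])    # flip it
        fixed.append(n)
    return fixed

# Example: one vertex normal was exported pointing inward.
print(align_shading_normals((0, 0, 1), [(0, 0, 1), (0, 0, -1)]))
# → [(0, 0, 1), (0, 0, 1)]
```

A production importer would also handle normals nearly perpendicular to the face, but the dot-product test covers the common export mistakes.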
Looking forward to more feedback!
I had the chance to test it a bit today, and for now my main wishes relate mostly to functionality.
What is FluidRay’s default scaling, given that it accepts whatever comes from the .obj file? Isn’t it important for a physically based renderer to work with correct scaling? I still believe a simple wireframe mode with a scaling grid and skydome is a must for any standalone visualization studio.
Example:
Then hit render and get it.
Also, a separate small material preview window is needed, instead of re-rendering 10 million polygons with indirect lighting when, for example, only the color of a pencil was changed.
The path tracer looks pretty fast, but the bidirectional one is in no way real-time, and it doesn’t seem to handle caustics well currently.
…what else? Yes, it should save the image (in a user-specified format) at chosen time intervals (for example, “output untonemapped .hdr every 30 min”).
I’m going to proceed with testing in my free time, and maybe I’ll find something new.
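(The requested interval save could be sketched like this; `render_one_pass` and `save_image` are hypothetical callbacks for illustration, not FluidRay RT’s actual API.)

```python
import time

def render_with_checkpoints(render_one_pass, save_image, interval_s=1800):
    """Run progressive render passes, saving the current image every
    interval_s seconds (1800 s = 30 min). render_one_pass returns True
    when the render is finished; both callbacks are hypothetical."""
    last_save = time.monotonic()
    while True:
        done = render_one_pass()
        now = time.monotonic()
        if now - last_save >= interval_s:
            save_image()            # e.g. write the untonemapped .hdr
            last_save = now
        if done:
            save_image()            # final save on completion
            break
```

Using a monotonic clock avoids spurious saves if the system clock is adjusted mid-render.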
Thanks for the great feedback. We’ve already put your suggestions on the to-do list.
The .obj file is imported as-is, with no scaling.
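(Since the file is loaded as-is, one workaround is to pre-scale the .obj before import. A minimal sketch, operating on the plain-text vertex lines of the Wavefront format:)

```python
def scale_obj(lines, factor):
    """Uniformly scale vertex positions ('v' lines) in Wavefront .obj
    text; all other lines pass through unchanged. Keeps only x y z
    (drops an optional w component) — a sketch, not a full parser."""
    out = []
    for line in lines:
        parts = line.split()
        if parts and parts[0] == "v":   # vertex position line
            coords = [float(x) * factor for x in parts[1:4]]
            out.append("v " + " ".join(f"{c:g}" for c in coords))
        else:
            out.append(line)
    return out

# Convert a model authored in centimeters to meters:
print(scale_obj(["v 100 0 50", "f 1 2 3"], 0.01))
# → ['v 1 0 0.5', 'f 1 2 3']
```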
The bidirectional path tracer is experimental; there is more work to do on it.