WorldViz User Forum  

WorldViz User Forum > Vizard

  #1  
Old 11-09-2006, 04:28 PM
jrodman jrodman is offline
Member
 
Join Date: Jun 2003
Posts: 21
Image blur

We're trying to simulate visual degradation in humans.

Is there a way to blur the image that the user sees in Vizard? We tried fog and placing a semi-transparent sheet in front of the camera, but neither of those are quite right. We want something that looks like the camera is out of focus. Something that merges adjacent pixels together.
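The "merge adjacent pixels" effect being asked for is a convolution blur: each output pixel is an average of its neighborhood. As a minimal illustrative sketch (plain Python on a 2D list, not Vizard API code), this is the idea a GPU blur shader implements per fragment:

```python
# Box blur: replace each pixel with the average of itself and its
# neighbors within `radius`, clamping at the image border.
def box_blur(image, radius=1):
    """Return a blurred copy of a 2D grayscale image (list of lists)."""
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += image[ny][nx]
                        count += 1
            out[y][x] = total / count
    return out

# A single bright pixel spreads into its neighbors:
image = [[0, 0, 0], [0, 9, 0], [0, 0, 0]]
blurred = box_blur(image, radius=1)
```

A real-time version runs this per pixel in a fragment shader instead, which is why the replies below turn to GLSL.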
  #2  
Old 11-09-2006, 11:12 PM
k_iwan k_iwan is offline
Member
 
Join Date: May 2006
Posts: 115
...Vizard plugins?

If I'm not mistaken, you need to create a custom renderer plug-in for Vizard to do this, or use the OpenGL Shading Language (GLSL).

Since I'm only a 3D modeler, I don't have the know-how to show you how it's done :P


Regards,
k_iwan
  #3  
Old 11-10-2006, 10:29 AM
farshizzo farshizzo is offline
WorldViz Team Member
 
Join Date: Mar 2003
Posts: 2,849
I have a sample script that performs a uniform blur over the entire screen. Is this what you are looking for? The script only works with the latest version of Vizard 3.0 and requires a decent graphics card that supports shaders. Are you able to use 3.0? Also, do you need this effect to work in stereo, or just mono?
  #4  
Old 11-15-2006, 02:43 PM
jrodman jrodman is offline
Member
 
Join Date: Jun 2003
Posts: 21
Quote:
Originally Posted by farshizzo
I have a sample script that performs a uniform blur over the entire screen. Is this what you are looking for? The script only works with the latest version of Vizard 3.0 and requires a decent graphics card that supports shaders. Are you able to use 3.0? Also, do you need this effect to work in stereo, or just mono?
Yes, we are using 3.0. I'm not sure if we have a decent graphics card, but I'd certainly like to try it. We only need mono.
Thanks!
  #5  
Old 11-15-2006, 03:40 PM
farshizzo farshizzo is offline
WorldViz Team Member
 
Join Date: Mar 2003
Posts: 2,849
I've attached a zip file which contains a script testBlur.py. This script depends on the included vizblur.py module. This example will only run on Vizard 3.00.1723 and above. You will need a graphics card that supports shaders.

Also, in the future please post Vizard 3.0 related questions in the Vizard 3.0 forum.
Attached Files
File Type: zip VizardBlur.zip (2.0 KB, 2150 views)
  #6  
Old 11-16-2006, 02:07 AM
k_iwan k_iwan is offline
Member
 
Join Date: May 2006
Posts: 115
Hi,

I tried it, but it did not run... it seems like something is missing?!

-------------------------------------------------------
FRAGMENT glCompileShader "" FAILED
FRAGMENT Shader "" infolog:
ERROR: 0:4: '{' : syntax error parse error
ERROR: 1 compilation errors. No code generated.
glLinkProgram "" FAILED
Program "" infolog:
Link failed. All shader objects have not been successfully compiled.
FRAGMENT glCompileShader "" FAILED
FRAGMENT Shader "" infolog:
ERROR: 0:4: '{' : syntax error parse error
ERROR: 1 compilation errors. No code generated.
glLinkProgram "" FAILED
Program "" infolog:
Link failed. All shader objects have not been successfully compiled.
-----------------------------------------------------------
A "vizblur.pyc" file was created in the same folder after executing this demo.

I'll try again tonight. (I'm running the latest beta version... Vizard R3 b3.) Next time I will post in the Vizard 3.0 forum.

Regards,
Iwan

Last edited by k_iwan; 11-16-2006 at 02:09 AM.
  #7  
Old 11-16-2006, 10:48 AM
farshizzo farshizzo is offline
WorldViz Team Member
 
Join Date: Mar 2003
Posts: 2,849
It seems your graphics card doesn't support the latest version of GLSL. You might try updating your driver if you already haven't. I'll try to make the shader compatible with older drivers as well.
  #8  
Old 11-16-2006, 03:44 PM
k_iwan k_iwan is offline
Member
 
Join Date: May 2006
Posts: 115

Hi

I thought so... My graphics card at home is an ATI Radeon 9600 Pro. (I wonder why I picked ATI in the first place; I always have problems with overheating.)
I doubt that updating to the latest driver will solve this problem, but I will give it a try.

I will try again on the Uni's computers.


Regards,
Iwan
  #9  
Old 11-16-2006, 03:48 PM
jrodman jrodman is offline
Member
 
Join Date: Jun 2003
Posts: 21
Quote:
Originally Posted by farshizzo
It seems your graphics card doesn't support the latest version of GLSL. You might try updating your driver if you already haven't. I'll try to make the shader compatible with older drivers as well.
I had the same problem. I'll try upgrading my driver.

Also, I posted here because I was hoping to find a solution that worked under Vizard 2.5 as well. The main project we need it for is 3.0, so it's OK if it doesn't work in 2.5.
  #10  
Old 11-20-2006, 11:53 AM
farshizzo farshizzo is offline
WorldViz Team Member
 
Join Date: Mar 2003
Posts: 2,849
I've attached another version which should work on your card. This script will only work with the latest version of Vizard 3.0.
Attached Files
File Type: zip VizardBlur.zip (1.9 KB, 2000 views)
  #11  
Old 12-05-2006, 10:08 PM
k_iwan k_iwan is offline
Member
 
Join Date: May 2006
Posts: 115
Thanks

Nice! This one works on my ancient ATI 3D card.

iwan
  #12  
Old 12-06-2006, 02:50 PM
jrodman jrodman is offline
Member
 
Join Date: Jun 2003
Posts: 21
Quote:
Originally Posted by farshizzo
It seems your graphics card doesn't support the latest version of GLSL. You might try updating your driver if you already haven't. I'll try to make the shader compatible with older drivers as well.
We're going to buy a new machine to run on. What do we need to look for in a graphics card to get this to work?
  #13  
Old 12-06-2006, 03:05 PM
jrodman jrodman is offline
Member
 
Join Date: Jun 2003
Posts: 21
Quote:
Originally Posted by farshizzo
I've attached another version which should work on your card. This script will only work with the latest version of Vizard 3.0.
The blurring in this one works for me. But, I'm having a number of problems with the script.

First, it gives me a series of 6 error dialog boxes, each titled:
"Producer: Internal Error [] Can't create the temporary window." where [] is a box representing an invalid character.
In the dialog box, it says:
"The operation completed successfully."

The menu items that you create do not work for me either. I get the "Producer" errors even when I comment out the menu items, so I think the two are separate problems.

The program is really slow to start and to close, but once it's running, it's reasonably speedy.

Finally, when I increase the "blurScale" to 3 or more, it starts to look pixelated. I presume this is because there's no diagonal blurring. I think I can modify the script myself to blur over a box or circle. I will attempt to do so now.

Still, this looks like it should do what I want. Thanks!
  #14  
Old 12-06-2006, 03:16 PM
farshizzo farshizzo is offline
WorldViz Team Member
 
Join Date: Mar 2003
Posts: 2,849
Can you provide some system specs so that we can try to reproduce the errors? (Operating system, service pack, graphics card, driver version, etc...)

Regarding graphics cards, I recommend nVidia GeForce cards.
  #15  
Old 12-07-2006, 05:16 PM
jrodman jrodman is offline
Member
 
Join Date: Jun 2003
Posts: 21
Quote:
Originally Posted by jrodman
Finally, when I increase the "blurScale" to 3 or more, it starts to look pixelated. I presume this is because there's no diagonal blurring. I think I can modify the script myself to blur over a box or circle. I will attempt to do so now.
Having looked a little closer, I now realize that the separate vertical and horizontal passes do generate diagonal blurring. I now think the pixelation comes from the fixed number of samples used to generate the blur: when the scale gets too big, the taps start skipping pixels.
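That "skipping pixels" observation can be made concrete with a quick sketch (plain Python, not shader code; `tap_offsets_px` is a hypothetical model of the sampling pattern, not part of the script): each blur pass takes a fixed 2*n+1 taps spaced blurScale pixels apart, so once blurScale exceeds 1.0, whole pixels fall between the taps.

```python
# Model of one separable blur pass: 2*n + 1 taps at offsets of
# index * blur_scale pixels from the center pixel.
def tap_offsets_px(n, blur_scale):
    """Pixel offsets sampled by one blur pass (assumed model of the shader)."""
    return [i * blur_scale for i in range(-n, n + 1)]

offsets = tap_offsets_px(4, 3.0)   # 9 taps at blurScale = 3
gap = offsets[1] - offsets[0]      # 3 px between adjacent taps: 2 px skipped
```

With the gap above 1 pixel, intermediate pixels never contribute to the average, which shows up as the blocky pixelation described here.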

I tried increasing the number of samples, but the current code adds one line of shader (GLSL) code for each sample, and there appears to be a limit to the amount of shader code that can be passed to the render node's apply function.

Next I tried rewriting the shader to use a for loop so there wouldn't have to be a separate line for each sample. This code, however, seems to crash, and I have no way to debug it since it runs inside Vizard, where I can't attach a debugger or print debug statements.

Here's my code:

Code:
		blur_source = """
		uniform sampler2D srcImage;
		uniform float blurScale;
		void main(void)
		{
			vec4 color = vec4(0,0,0,0);
			float numSamples = blurScale * 4.0;
			for(float index = -numSamples; index <= numSamples; index += 1.0)
			{
				float offset = index / %size%;
				float temp = index / numSamples;
			float weight = 0.05 + ((1.0 - temp * temp) / 4.0);
				color += ( texture2D( srcImage, gl_TexCoord[0].xy + vec2( %modifier% ) ) * weight );
			}
			color.a = 1.0;
			gl_FragColor = color / 1.7;
		}"""

		#Create horizontal blur shader
		temp_source = blur_source.replace('%size%', str(float(size[0])) )
		blur_code = temp_source.replace('%modifier%', 'offset * blurScale, 0.0' )
		horzBlurShader = viz.addShader(frag=blur_code)

		#Create vertical blur shader
		temp_source = blur_source.replace('%size%', str(float(size[1])) )
		blur_code = temp_source.replace('%modifier%', '0.0, offset * blurScale' )
		vertBlurShader = viz.addShader(frag=blur_code)
I think I left the rest of your script the same. When I run this, I get a notification that a program crashed, but winviz is left running, taking up all of the CPU. I'm guessing winviz spawns a process to run the shader, that process crashes, and winviz is left waiting for it.

Any ideas what would cause this? Is it worth pursuing if we're going to get a video card that will run your other script?
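As an aside on the hard-coded `/ 1.7` divisor in the shader above: summing the tap weights for the original 9-tap case (numSamples = 4, i.e. blurScale = 1) shows where it roughly comes from. A small stand-alone check, mirroring the GLSL weight formula in plain Python:

```python
# Weight formula from the fragment shader: w = 0.05 + (1 - t^2) / 4,
# where t = index / numSamples for index in [-numSamples, numSamples].
num_samples = 4.0
weights = []
for index in range(-4, 5):
    t = index / num_samples
    weights.append(0.05 + (1.0 - t * t) / 4.0)

total = sum(weights)  # 1.7625, close to the shader's 1.7 divisor
```

So the divisor approximately normalizes the weighted sum; it is only exact-ish for this particular tap count, which is worth keeping in mind when changing numSamples.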
  #16  
Old 12-07-2006, 05:42 PM
farshizzo farshizzo is offline
WorldViz Team Member
 
Join Date: Mar 2003
Posts: 2,849
If you want a really blurry look, then just decrease the size of the blur texture. The default is 512x512. Try 128x128 or 64x64.
Code:
import vizblur
vizblur.enable(128,128) #Lower texture size creates more blur
  #17  
Old 12-08-2006, 10:50 AM
farshizzo farshizzo is offline
WorldViz Team Member
 
Join Date: Mar 2003
Posts: 2,849
I've attached a better version of the blur example that allows you to specify the blur kernel size. Previously it was 9. In the test script I have set it to 40. Surprisingly this didn't affect performance on my machine. A larger blur kernel will create more blur. Once you specify the blur kernel, you cannot change it. I recommend setting the blur kernel to the point where it matches your maximum desired blur. This way you can keep the blur scale under 1.0, which will prevent pixelization.
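To make the "keep the blur scale under 1.0" advice concrete, here is a back-of-the-envelope helper (plain Python; `blur_settings` is hypothetical and assumes, as in the shader above, that the spacing between taps equals blurScale): choose the kernel half-width at least as large as the maximum blur radius you need, and the required scale stays at or below 1.0, so no pixels are skipped.

```python
# Given a desired maximum blur radius in pixels and a blur kernel
# half-width (number of taps per side), compute the blurScale needed
# and the resulting spacing between taps.
def blur_settings(max_blur_radius_px, kernel_half_width):
    """Return (blur_scale, tap_spacing_px) under the assumed tap model."""
    blur_scale = max_blur_radius_px / kernel_half_width
    # Adjacent taps sit blur_scale pixels apart; > 1.0 means skipped pixels.
    return blur_scale, blur_scale

scale, spacing = blur_settings(30, 40)  # a kernel of 40 covers a 30 px blur
```

With spacing at or below one pixel, every pixel under the kernel contributes, which is exactly why the larger kernel avoids the pixelation seen earlier.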
Attached Files
File Type: zip VizardBlur.zip (2.0 KB, 1990 views)
  #18  
Old 12-12-2006, 05:12 PM
jrodman jrodman is offline
Member
 
Join Date: Jun 2003
Posts: 21
Quote:
Originally Posted by farshizzo
Can you provide some system specs so that we can try to reproduce the errors? (Operating system, service pack, graphics card, driver version, etc...)

Regarding graphics cards, I recommend nVidia GeForce cards.
My laptop's specs:
Windows XP Professional Service Pack 2
Dell Inspiron I9300
Intel M 1.73GHz
ATI Mobility Radeon X300 Driver version: 6.14.10.6483

I think the only significant problem I'm having now is the "Producer" error dialog boxes that I mentioned before. It's possible we won't have that problem when we order the new machine that we'll run the actual experiments on. But it would be nice for things to work on my laptop too.

Thanks for all your help on this.
  #19  
Old 12-13-2006, 04:38 PM
jrodman jrodman is offline
Member
 
Join Date: Jun 2003
Posts: 21
Quote:
Originally Posted by farshizzo
Regarding graphics cards, I recommend nVidia GeForce cards.
We're thinking of replacing the graphics cards in our current machines.

Would a 3DFUZION GeForce 6200 with 128 MB be good enough? It's the best card I can find that fits in a PCI slot.
  #20  
Old 12-15-2006, 04:48 PM
farshizzo farshizzo is offline
WorldViz Team Member
 
Join Date: Mar 2003
Posts: 2,849
I haven't tried that specific card, but I don't think it will cause any problems. My computer has a GeForce 6800 which shouldn't be too different from a 6200.