Channel: GLSL / Shaders - Processing 2.x and 3.x Forum

PShader Library Example, Processing 3, "Cannot compile fragment shader..."


Hello Team,

I feel foolish that I can't figure this out. I am literally copying the blur code example from the Processing reference.

https://processing.org/reference/PShader.html

And I get this error in both Processing 3 and Processing 2:

"Cannot compile fragment shader: ERROR 0:9 '<' : syntax error"

I must be missing something simple yes? :)

PShader blur;

void setup() {
  size(640, 360, P2D);
  // Shader files must be in the "data" folder to load correctly
  blur = loadShader("blur.glsl");
  stroke(0, 102, 153);
  rectMode(CENTER);
}

void draw() {
  filter(blur);
  rect(mouseX-75, mouseY, 150, 150);
  ellipse(mouseX+75, mouseY, 150, 150);
}
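
Editor's note: a common cause of that exact error is that blur.glsl was saved with HTML markup from the reference page rather than as plain GLSL, so the compiler chokes on a stray `<` character; re-saving the shader as a plain text file usually fixes it. For comparison, here is a minimal known-good fragment shader (my own sketch of a 3x3 box blur using Processing's standard texture/texOffset uniforms, not the exact file from the reference):

// blur.glsl - minimal 3x3 box blur for filter()
#ifdef GL_ES
precision mediump float;
#endif

// shader type hint required by Processing 2 (Processing 3 detects it)
#define PROCESSING_TEXTURE_SHADER

uniform sampler2D texture;
uniform vec2 texOffset;   // 1.0 / texture size, supplied by Processing

varying vec4 vertColor;
varying vec4 vertTexCoord;

void main() {
  // average the 3x3 neighbourhood around the current texel
  vec4 sum = vec4(0.0);
  for (int i = -1; i <= 1; i++) {
    for (int j = -1; j <= 1; j++) {
      sum += texture2D(texture, vertTexCoord.st + vec2(i, j) * texOffset);
    }
  }
  gl_FragColor = (sum / 9.0) * vertColor;
}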

blob detection in glsl shader


Hello,

For an installation, I use a Kinect v2 to obtain a depth map of a space, and I need to compute blob centroids from that depth map. I tried the BlobDetection, Blobscanner, and OpenCV libraries, but they were too slow to be usable. At the moment the best solution is to send two preprocessed depth images to Isadora via Syphon, use two Eyes++ actors (blob detectors) in Isadora, and send the result back to Processing via OSC, where a fragment shader finally processes the image.

It's complicated, but it works at 30 fps in both Processing and Isadora with very acceptable lag.

I am searching for a way to do the blob detection in a GLSL shader via Processing.

Does anyone have an idea how to do that?
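
Editor's note: one GPU-friendly approach (a sketch of the idea, not a full blob labeller) is to threshold the depth map in a fragment shader and write the mask plus mask-weighted coordinates into the color channels; averaging that output down to a single pixel (for example via a tiny resized copy of the buffer) then yields sums whose ratios g/r and b/r are the centroid in normalized coordinates. This gives one centroid per region of interest, so multiple blobs need one pass per region. The encode pass, assuming Processing's standard texture-shader uniforms:

// Pass 1: encode (mask, x*mask, y*mask) for a depth threshold
uniform sampler2D texture;   // the depth image
uniform float threshold;     // set from the sketch

varying vec4 vertTexCoord;

void main() {
  float depth = texture2D(texture, vertTexCoord.st).r;
  float mask = step(depth, threshold);         // 1.0 where depth <= threshold
  gl_FragColor = vec4(mask,
                      mask * vertTexCoord.s,   // x weighted by mask
                      mask * vertTexCoord.t,   // y weighted by mask
                      1.0);
}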

Thank you in advance. Jacques

texCoord problem


Hello! It's me again. I'm in the hard process of learning GLSL. As far as I can see, there are many syntax variants and versions for writing a shader program, and it's a little confusing to write a shader for Processing. This time I have two questions.

1) I'm reading the OpenGL 4.0 Shading Language Cookbook. One of the first examples uses a uniform block. How do I set a uniform block in Processing? Because if I set the members one by one, it doesn't work.

In the fragment shader:

uniform BlobSettings {
  vec4 InnerColor;
  vec4 OuterColor;
  float RadiusInner;
  float RadiusOuter;
};

In Processing:

   sh.set("RadiusOuter", 10.0 );
   sh.set("RadiusInner", 20.0 );
   sh.set("InnerColor", c);
   sh.set("OuterColor", b);

2) Following this Lighthouse3D tutorial http://www.lighthouse3d.com/tutorials/glsl-tutorial/texture-coordinates/ I'm trying to reproduce the texCoord example, and it doesn't work. Here is the code:

PShader sh;
PShape t;

import peasy.*;
import peasy.org.apache.commons.math.*;
PeasyCam cam;

void setup() {
  size(500, 500, P3D);
  cam = new PeasyCam(this, 800);
  t = trian();
  sh = loadShader("basicShaderFrag.glsl", "basicShader.glsl");
}

void draw() {
  background(0);
  shader(sh);
  shape(t);
}

PShape trian() {
  PShape t = createShape();
  t.beginShape(QUAD_STRIP);
  // t.fill(random(255), 0, 0);
  t.vertex(-200, -200);
  // t.fill(0, 255, 0);
  t.vertex(-200, 200);
  // t.fill(0, 0, 255);
  t.vertex(200, -200);
  t.vertex(200, 200);
  t.endShape();
  return t;
}

Vertex shader:

#version 330

uniform mat4 transform;

in vec4 position;
in vec3 color;
in vec2 texCoord;

out vec3 vertColor;
out vec2 TexCoord;

void main(){

  TexCoord = texCoord;
  vertColor = color;
  gl_Position = transform * position;
}

Fragment shader:

#ifdef GL_ES
precision mediump float;
#endif

in vec3 vertColor;
in vec2 TexCoord;

void main() {
  gl_FragColor = vec4(TexCoord, 1.0, 1.0);
}

It was supposed to produce this result:

http://www.lighthouse3d.com/wp-content/uploads/2013/02/texturecoordinates.jpg

But it's only a blue rect, because the TexCoord value is 0, and I'm lost as to why. How do the texture coordinates work?
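
Editor's note: the blue quad is expected here; the vertices in trian() are created without UV coordinates, so Processing fills the texCoord attribute with zeros, and vec4(0.0, 0.0, 1.0, 1.0) is pure blue. A sketch of a fix, assuming normalized texture coordinates (textureMode(NORMAL)) and that Processing forwards UVs once they are given per vertex, is the 4-argument vertex() call:

PShape trian() {
  PShape t = createShape();
  t.beginShape(QUAD_STRIP);
  t.textureMode(NORMAL);       // interpret UVs in the 0..1 range
  t.vertex(-200, -200, 0, 0);  // x, y, u, v
  t.vertex(-200,  200, 0, 1);
  t.vertex( 200, -200, 1, 0);
  t.vertex( 200,  200, 1, 1);
  t.endShape();
  return t;
}

If the attribute still arrives empty, binding any dummy texture to the shape should force Processing down the textured-geometry path.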

Thanks.

GLSL and video Texture


Hello everyone. I'm trying to pass a video as a texture into a fragment shader, but the sketch crashes when I run it.

Here is the code:

import processing.video.*;
import peasy.*;
import peasy.org.apache.commons.math.*;

PeasyCam cam;
PShader sh;
float count;
Movie mov;
PGraphics p;

void setup() {
  size(1440, 900, P3D);
  mov = new Movie(this, "osc_noc.mov");
  mov.play();
  p = createGraphics(width, height);
  cam = new PeasyCam(this, 500);
  sh = loadShader("basicShaderFrag.glsl", "basicShader.glsl");
}

void movieEvent(Movie m) {
  m.read();
}

void draw() {
  background(0);
  shader(sh);
  count += 0.09;
  sh.set("u_time", count);

  sphere(100);
  p.beginDraw();
  p.background(0);
  p.image(mov, 0, 0, 200, 200);
  p.endDraw();
  sh.set("tex", p);
  // image(p, 5, 260, 200, 200);
}

Vertex shader:

#version 150

uniform mat4 transform;
uniform sampler2D tex;

in vec4 position;
in vec2 texCoord;
in vec3 normal;

out vec2 TexCoord;

void main(){

  TexCoord = texCoord;
  gl_Position = transform * position;
}


Fragment shader:

#ifdef GL_ES
precision mediump float;
#endif

#define PI 3.14

in vec2 TexCoord;
uniform float u_time;
uniform sampler2D tex;

void main(){
  vec2 uv = TexCoord;

  gl_FragColor = vec4(texture(tex, TexCoord));
}

A white screen appears, and then it closes. The console just says "Finished". Could it be a bug? I can pass a PImage as a texture, but when I put the fragment and vertex shader programs into the sketch folder, it crashes...
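
Editor's note: one likely culprit (an assumption, since the crash log is empty): the vertex shader declares #version 150 but the fragment shader has no version directive, so it is compiled as legacy GLSL, where the in qualifiers and the texture() call don't exist; mismatched stages can bring the whole context down. A minimal matching fragment shader would be:

#version 150

#define PI 3.14

in vec2 TexCoord;

uniform float u_time;
uniform sampler2D tex;

out vec4 fragColor;   // gl_FragColor does not exist in version 150

void main() {
  vec2 uv = TexCoord;
  fragColor = texture(tex, uv);
}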

Need example code for texture shader


Hi. I'm trying to draw a texture with a shader, but it doesn't work. What am I doing wrong? The bg draws correctly.


PShader mShader;
PImage bg;
PImage tex;

void setup() {
  size(640, 360, P2D);
  noStroke();
  textureMode(NORMAL);
  bg = loadImage("bg.jpg");
  tex = loadImage("smoke.png");
  mShader = loadShader("texfrag.glsl", "texvert.glsl");
  mShader.set("texture", tex);
  background(255);
}

void draw() {
  image(bg, 0, 0);
  shader(mShader);
}

texfrag.glsl:

#ifdef GL_ES
precision mediump float;
precision mediump int;
#endif

uniform sampler2D texture;

varying vec4 vertColor;
varying vec4 vertTexCoord;

void main() {
  gl_FragColor = texture2D(texture, vertTexCoord.st) * vertColor;
}

texvert.glsl:


uniform mat4 transform;
uniform mat4 texMatrix;

attribute vec4 position;
attribute vec4 color;
attribute vec2 texCoord;

varying vec4 vertColor;
varying vec4 vertTexCoord;
uniform sampler2D texture;

void main() {
  gl_Position = transform * position;
  vertColor = color;
  vertTexCoord = texMatrix * vec4(texCoord, 1.0, 1.0);
}

Or I need sample code for drawing a texture with a shader.
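
Editor's note, two observations (my reading of the sketch, not an official diagnosis): shader(mShader) only affects geometry drawn after it, and nothing is drawn after it here; and for Processing's built-in texture shaders, the texture sampler is bound from whatever you actually draw (image() or a textured shape) rather than from set(). A minimal draw() along these lines:

void draw() {
  image(bg, 0, 0);         // drawn with the default shader
  shader(mShader);         // applies to geometry drawn from here on
  beginShape(QUADS);
  texture(tex);            // binds the "texture" sampler for this shape
  vertex(100, 100, 0, 0);  // x, y, u, v (textureMode(NORMAL) is set above)
  vertex(300, 100, 1, 0);
  vertex(300, 300, 1, 1);
  vertex(100, 300, 0, 1);
  endShape();
  resetShader();           // restore the default shader afterwards
}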

Some ShaderToy shaders converted to Processing


Hi all

Sharing some of the shaders I have converted from ShaderToy to work with Processing 3... Will be adding more as I do them.

Currently there are:

- 3 types of VHS effect
- Glitch
- Sobel edge detection with neon outline
- Conversion of different colours to different ASCII characters

https://github.com/georgehenryrowe/ShadersForProcessing3

:)

transformation using glsl shaders


I need to implement transformations on a 3D object (like a cube) using a GLSL shader, and I am unable to figure out how to do it. Can anyone guide me, please? I'm new to GLSL shaders.
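
Editor's note: as a starting point, here is a minimal sketch of a vertex shader (my own example, not from any particular tutorial) that applies an extra rotation on top of Processing's built-in transform matrix. The angle uniform would be set from the sketch with sh.set("angle", value), and the cube drawn with box() after shader(sh); pair it with a pass-through fragment shader that outputs vertColor.

// rotate.vert: rotate every vertex around the Y axis before projecting
uniform mat4 transform;   // Processing's projection * modelview
uniform float angle;      // set from the sketch each frame

attribute vec4 position;
attribute vec4 color;

varying vec4 vertColor;

void main() {
  float c = cos(angle);
  float s = sin(angle);
  // column-major rotation matrix around the Y axis
  mat4 rotY = mat4(  c, 0.0,  -s, 0.0,
                   0.0, 1.0, 0.0, 0.0,
                     s, 0.0,   c, 0.0,
                   0.0, 0.0, 0.0, 1.0);
  vertColor = color;
  gl_Position = transform * rotY * position;
}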

P2D & P3D renderer error on MacOS


Basically, I get this error when I try to use either the P2D or P3D renderer. I have seen this problem on the forums before, but those reports were all on Windows, or an antivirus was preventing the PDE from creating a temp folder. I'm on macOS and I do not have any antivirus installed. It doesn't have anything to do with the code; I am, however, able to export the sketch and run the application flawlessly.

java.lang.UnsatisfiedLinkError: Can't load library: /Users/daineesvang/natives/macosx-universal//nativewindow_awt
    at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1827)
    at java.lang.Runtime.load0(Runtime.java:809)
    at java.lang.System.load(System.java:1086)
    at com.jogamp.common.jvm.JNILibLoaderBase.loadLibraryInternal(JNILibLoaderBase.java:624)
    at com.jogamp.common.jvm.JNILibLoaderBase.access$000(JNILibLoaderBase.java:63)
    at com.jogamp.common.jvm.JNILibLoaderBase$DefaultAction.loadLibrary(JNILibLoaderBase.java:106)
    at com.jogamp.common.jvm.JNILibLoaderBase.loadLibrary(JNILibLoaderBase.java:487)
    at jogamp.nativewindow.NWJNILibLoader.access$000(NWJNILibLoader.java:39)
    at jogamp.nativewindow.NWJNILibLoader$1.run(NWJNILibLoader.java:49)
    at jogamp.nativewindow.NWJNILibLoader$1.run(NWJNILibLoader.java:41)
    at java.security.AccessController.doPrivileged(Native Method)
    at jogamp.nativewindow.NWJNILibLoader.loadNativeWindow(NWJNILibLoader.java:41)
    at jogamp.nativewindow.jawt.JAWTUtil.<clinit>(JAWTUtil.java:336)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at com.jogamp.nativewindow.NativeWindowFactory$3.run(NativeWindowFactory.java:346)
    at com.jogamp.nativewindow.NativeWindowFactory$3.run(NativeWindowFactory.java:342)
    at java.security.AccessController.doPrivileged(Native Method)
    at com.jogamp.nativewindow.NativeWindowFactory.initSingleton(NativeWindowFactory.java:342)
    at com.jogamp.newt.NewtFactory$1.run(NewtFactory.java:68)
    at java.security.AccessController.doPrivileged(Native Method)
    at com.jogamp.newt.NewtFactory.<clinit>(NewtFactory.java:65)
    at processing.opengl.PSurfaceJOGL.initIcons(PSurfaceJOGL.java:498)
    at processing.opengl.PSurfaceJOGL.initFrame(PSurfaceJOGL.java:134)
    at processing.core.PApplet.initSurface(PApplet.java:10889)
    at processing.core.PApplet.runSketch(PApplet.java:10776)
    at processing.core.PApplet.main(PApplet.java:10476)
A library relies on native code that's not available.
Or only works properly when the sketch is run as a 32-bit application.

Lot of shapes with a shader?


I have a for loop running on the CPU that draws thousands of 2D circles per frame using ellipse(). Could I somehow pass the locations of all of these ellipses to a shader and see a good performance increase?

It seems like this should be easy to do with a frag shader, but I am pretty inexperienced and can't find any examples of doing anything similar in Processing.

Edit: I think the PixelFlow customParticles example has everything; it's just not very straightforward to adapt.
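
Editor's note: for a few hundred circles, one idea (a sketch of the approach, not PixelFlow's method) is to upload the centres as a uniform array and evaluate a distance field per pixel in a fragment shader; if I recall the API correctly, PShader.set() accepts a float array plus the number of components per element. For thousands of circles the array won't fit in uniforms, and encoding positions into a texture (as PixelFlow does) becomes the way to go.

// circles.glsl: evaluate 100 circles per pixel
#ifdef GL_ES
precision mediump float;
#endif

uniform vec2 centers[100];   // circle centres in pixels
uniform float radius;

void main() {
  // note: gl_FragCoord's origin is the bottom-left corner,
  // so y is flipped relative to Processing's coordinates
  float d = 1e6;
  for (int i = 0; i < 100; i++) {
    d = min(d, distance(gl_FragCoord.xy, centers[i]));
  }
  gl_FragColor = d < radius ? vec4(0.0, 0.0, 0.0, 1.0) : vec4(1.0);
}

And the sketch side:

PShader circles;
float[] centers = new float[200];   // 100 xy pairs

void setup() {
  size(800, 800, P2D);
  noStroke();
  circles = loadShader("circles.glsl");
  for (int i = 0; i < centers.length; i++) centers[i] = random(800);
  circles.set("centers", centers, 2);   // assumed set(name, float[], ncoords)
  circles.set("radius", 10.0);
}

void draw() {
  shader(circles);
  rect(0, 0, width, height);   // one full-screen quad does all the work
}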

Vertex of shapes in shader


I wonder why shapes drawn directly in the sketch send different vertex values to the shader than shapes drawn via a PShape. For example, I took the per-pixel lighting example from http://www.processing.org/tutorials/pshader/ and changed the vertex shader:

void main() {
  gl_Position = transform * vertex;
  vec3 ecVertex = vec3(modelview * vertex);

  ecNormal = normalize(normalMatrix * normal);
  lightDir = normalize(lightPosition.xyz - ecVertex);
  vertColor = sin(vertex);
}

so that the color is generated from the position. This works as expected. But when I change the sketch so it draws the can on every draw call, like this:

void createCan(float r, float h, int detail) {

  beginShape(QUAD_STRIP);
  noStroke();
  for (int i = 0; i <= detail; i++) {
    float angle = TWO_PI / detail;
    float x = sin(i * angle);
    float z = cos(i * angle);
    float u = float(i) / detail;
    vertex(x * r, -h/2, z * r, u, 0);
    vertex(x * r, +h/2, z * r, u, 1);
  }
  endShape();
}

the colors start to flicker and change on every frame. Can someone explain this?

PGraphics and PeasyCam


Hi! I'm trying to do a multipass shader effect. The problem is that PeasyCam doesn't apply its transformations in the buffer.

PShader sh;
PShader sha;

PGraphics buffer;
PGraphics buffer2;

import com.hamoid.*;
import themidibus.*;
import peasy.*;
import peasy.org.apache.commons.math.*;

PeasyCam cam;
float count;

void settings() {
  fullScreen(P3D);
}

void setup() {
  background(0);
  cam = new PeasyCam(this, 500);

  sh = loadShader("basicShaderFrag.glsl", "basicShader.glsl");
  sha = loadShader("frag2.glsl", "basicShader.glsl");

  buffer = createGraphics(width, height, P3D);
  buffer2 = createGraphics(width, height, P3D);
}

void draw() {
  background(0);
  // println(frameRate);
  count += 0.005;
  sh.set("u_time", count);

  buffer.beginDraw();
  render(buffer);
  buffer.endDraw();

  buffer2.beginDraw();
  buffer2.background(0);
  buffer2.shader(sh);
  buffer2.image(buffer, 0, 0);
  buffer2.endDraw();

  cam.getState().apply(buffer2);    // here is the question: the image shows the shader, but as if it were a 2D screen
  cam.beginHUD();
  image(buffer2, 0, 0);
  cam.endHUD();
}

void render(PGraphics a) {
  a.background(0, 50);
  a.noStroke();
  s.run();
  a.sphere(200);
}

If I skip buffer2 and apply the shader to buffer directly, it works fine.

I was following this post https://github.com/jdf/peasycam/issues/25 but it doesn't work for me.

What am I doing wrong?
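
Editor's note, a sketch of the fix that the linked peasycam issue suggests (treat this as an assumption, not a tested answer for this exact setup): apply the camera state inside the offscreen buffer's own beginDraw()/endDraw(), before rendering the geometry, instead of after the buffer is finished:

buffer.beginDraw();
cam.getState().apply(buffer);   // PeasyCam's view applied to this pass
render(buffer);
buffer.endDraw();

buffer2.beginDraw();
buffer2.background(0);
buffer2.shader(sh);
buffer2.image(buffer, 0, 0);    // pure 2D post-process, no camera needed
buffer2.endDraw();

cam.beginHUD();
image(buffer2, 0, 0);
cam.endHUD();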

Flat shading for 3D objects?


Is there a way to disable the nice smooth shading in P3D sketches? For example, to look like this:

[image: example of a flat-shaded 3D model]

When making files for 3D printing, I find it's nice to have just flat shading, so you can see all the triangles.
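
Editor's note: there is no built-in switch for this as far as I know, but a custom shader can produce it; if the fragment shader derives the normal from screen-space derivatives of the eye-space position, the normal is constant across each triangle and every facet renders flat. A sketch (works on desktop GL; on GL ES the standard-derivatives extension is needed):

// flat.vert: pass eye-space position and color through
uniform mat4 transform;
uniform mat4 modelview;

attribute vec4 position;
attribute vec4 color;

varying vec3 ecPosition;
varying vec4 vertColor;

void main() {
  ecPosition = vec3(modelview * position);
  vertColor = color;
  gl_Position = transform * position;
}

// flat.frag: per-face normal from screen-space derivatives
varying vec3 ecPosition;
varying vec4 vertColor;

void main() {
  // dFdx/dFdy span the triangle's plane; their cross product is
  // constant across the face, so the shading is flat per triangle
  vec3 n = normalize(cross(dFdx(ecPosition), dFdy(ecPosition)));
  float diff = max(dot(n, vec3(0.0, 0.0, 1.0)), 0.0);
  gl_FragColor = vec4(vertColor.rgb * diff, vertColor.a);
}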

Trouble setting uniforms for PShaders used by secondary renderers


I'm trying to do some low-level OpenGL rendering onto a separate PGraphics buffer for a library I'm rewriting, and I've run into an issue: I can get my geometry to render on the separate PGraphics buffer, and a bound PShader works, but I can't set any uniforms on it. It appears that PShader.set() is only capable of acting on the main window's PGL instance. Is that accurate? Is there a workaround other than running the uniform-setting calls directly through the separate PGraphics' PGL?

  1. Create a PGraphics offscreenBuffer and run offscreenBuffer.beginDraw();
  2. Set up a PGL context: PGL pgl = offscreenBuffer.beginPGL();
  3. Create and bind a PShader.
  4. Do all my OpenGL work on offscreenBuffer's PGL.
  5. Try to set a uniform on my shader with myShader.set("greenChannel", 0.5f);
  6. The shader runs, but the output on the offscreenBuffer's PGL behaves as if the uniform's value is 0.
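
Editor's note: in case it helps, the raw-PGL route would look roughly like the sketch below. The big caveat is getting at the shader's GL program handle; a glProgram field is assumed here, which may be protected or renamed depending on the Processing version, so check the PShader source before relying on it:

PGL pgl = offscreenBuffer.beginPGL();
myShader.bind();
// ASSUMPTION: glProgram as the program handle; verify against your
// Processing version's PShader implementation
int loc = pgl.getUniformLocation(myShader.glProgram, "greenChannel");
pgl.uniform1f(loc, 0.5f);
// ... issue the rest of the OpenGL calls on pgl ...
myShader.unbind();
offscreenBuffer.endPGL();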

How do use a sampler2d array in Processing?


I would like to keep a frame buffer in the shader (say, the last second of video).

To do that, I tried uniform sampler2D tex[30]; and from Processing I write into that array with shader.set("tex[" + F + "]", frame);

Somehow it doesn't work. It behaves as if there is only one frame and the frame index is completely ignored: it always displays the last received video frame, no matter which indices I use for reading or writing.

I thought about having 30 separate sampler2D variables, but then I could not access them by index... Any ideas on how to achieve this?
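
Editor's note, one workaround that avoids sampler arrays entirely (a sketch of the idea, untested): stack the 30 frames vertically into a single PGraphics used as a ring buffer, and turn the frame index into a vertical offset inside the shader. Sketch side:

PGraphics atlas;   // created once as createGraphics(w, h * 30, P2D)
int head = 0;      // ring-buffer write position

void storeFrame(PImage frame, int w, int h) {
  atlas.beginDraw();
  atlas.image(frame, 0, head * h, w, h);   // write into row `head`
  atlas.endDraw();
  head = (head + 1) % 30;
  myShader.set("frames", atlas);
  myShader.set("head", head);
}

And on the GLSL side, a helper that fetches the frame from `ago` steps back:

uniform sampler2D frames;   // 30 frames stacked vertically
uniform int head;           // next write position

vec4 frameAt(int ago, vec2 uv) {
  // newest frame is head - 1; wrap into the 0..29 range
  float idx = mod(float(head - 1 - ago) + 30.0, 30.0);
  return texture2D(frames, vec2(uv.x, (uv.y + idx) / 30.0));
}

Mind the maximum texture height (30x the video height has to fit the GPU limit); stacking into a grid instead of a single column relaxes that.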

Setting projection and modelview matrices manually


Hi. I'd like to copy the projection and modelview matrices from one PGraphics into another, so I can render different passes of the same scene no matter what camera library is in use.

I have tried using get() and set() on both matrices like this (and different alternatives, such as projmodelview and combinations of set() and apply()):

PMatrix3D modelView = ((PGraphicsOpenGL) pgSrc).modelview.get();
PMatrix3D projection = ((PGraphicsOpenGL) pgSrc).projection.get();
((PGraphicsOpenGL) pgDst).projection.set(projection);
((PGraphicsOpenGL) pgDst).modelview.set(modelView);

I also tried the pure OpenGL calls from the method copyMatrices() from here

None of them worked. I can reuse the same PGraphics for each pass, but copying the matrices would be more convenient. Does anyone know how it should be done?
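
Editor's note, one thing worth checking (an assumption on my part from reading PGraphicsOpenGL, not a verified fix): writing into the modelview and projection fields directly leaves the derived matrices (modelviewInv, projmodelview) stale, which would explain why nothing seems to change. Going through the public setters, which rebuild those, may behave better:

// ASSUMPTION: PGraphicsOpenGL exposes setProjection()/setMatrix();
// this API has moved around between versions, so verify first
PGraphicsOpenGL src = (PGraphicsOpenGL) pgSrc;
PGraphicsOpenGL dst = (PGraphicsOpenGL) pgDst;

dst.beginDraw();
dst.setProjection(src.projection.get());   // should also refresh projmodelview
dst.setMatrix(src.modelview.get());        // should also refresh the inverse
// ... render the pass into dst ...
dst.endDraw();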

Thanks!


Render depth buffer onto 2d PGraphics


I would like to use the depth buffer of a PGraphics3D in a texture shader on a PGraphics2D. I could not find a way to share the depth buffer of the 3D scene with the 2D texture directly, so I decided to render the 3D depth with a fragment shader into a texture, and then use that texture as a separate input to the 2D texture fragment shader.

In the example sketch, you can enable the depth-pass shader for the 3D scene (left picture) with any key, so the depth buffer is correctly drawn onto the screen.

Now the problem I have is that I am not able to shade/filter the 3D context with the depth-buffer shader and draw the result onto the 2D canvas. Without that, I am not able to use the depth inside the next texture shader I would like to apply to the 2D graphics.

I hope it is not too confusing. What I want is to have the depth information in a 2d texture shader.

Attached is the example sketch, which currently works on the right side (no depth information is shaded onto the 2D context).

void createDepthImage(PGraphics3D graphics)
{
  if (!graphics.is3D())
    return;

  // add shader to graphics
  graphics.shader(shader);

  depthImage.beginDraw();
  depthImage.image(graphics, 0, 0);
  depthImage.endDraw();

  // reset shader after operation
  if (!shaderApplied)
    graphics.resetShader();
}

Complete sketch: gist.github.com/d6a61bcfe1ffc4f36eb1592d7143ab8f
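
Editor's note: if I read the sketch right, graphics.shader(shader) only affects geometry subsequently drawn into graphics; it does not filter graphics while it is being drawn into depthImage, and a finished color image has no depth left to recover anyway. A sketch of an alternative: render the scene a second time into a dedicated 3D buffer with the depth shader active, and hand that buffer to the 2D texture shader as a plain sampler2D uniform (drawScene() here is a hypothetical function that draws the same geometry as the main pass):

// Second render pass: same geometry, depth-visualising shader
void createDepthImage(PGraphics3D depthPass) {
  depthPass.beginDraw();
  depthPass.background(255);
  depthPass.shader(shader);    // the depth shader from the sketch
  drawScene(depthPass);        // hypothetical: redraw the scene here
  depthPass.resetShader();
  depthPass.endDraw();
}

// later, for the 2D pass:
// textureShader.set("depthMap", depthPass);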

Get pixel values and displace the vertex?


I want to write GLSL shaders that use the pixel values of an image as a magnitude to displace the vertices of an object in the direction of the surface normal, incorporate a point light, and demonstrate the result.
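
Editor's note: the core of this is a vertex shader that samples the image (vertex texture fetch, which is fine on desktop GL) and offsets each vertex along its normal. A sketch using Processing's standard attribute and uniform names, with dispMap and amount set from the sketch; for the point light, this pairs naturally with the per-pixel lighting example in the PShader tutorial, which consumes ecNormal and a lightPosition uniform in the fragment shader:

// displace.vert: move vertices along the normal by image brightness
uniform mat4 transform;      // projection * modelview (Processing standard)
uniform mat3 normalMatrix;
uniform sampler2D dispMap;   // sh.set("dispMap", img) from the sketch
uniform float amount;        // displacement scale, sh.set("amount", ...)

attribute vec4 position;
attribute vec3 normal;
attribute vec2 texCoord;

varying vec3 ecNormal;

void main() {
  float h = texture2D(dispMap, texCoord).r;   // brightness as magnitude
  vec4 displaced = position + vec4(normal * h * amount, 0.0);
  ecNormal = normalize(normalMatrix * normal);
  gl_Position = transform * displaced;
}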

How to port this shader to Processing 3?


Shader in question: https://shaderfrog.com/app/view/1078?view=shader

I'm a complete GLSL noob and was wondering how to go about applying this shader to an imported .obj in Processing.

My sketch is currently as follows:

PShape obj;
PShader shader;

void setup() {
  size(360, 720, P3D);
  obj = loadShape("mesh.obj");
  shader = loadShader("shaderFrogFrag.glsl", "shaderFrogVert.glsl");
}

void draw() {
  shader.set("color", 0.3, 0.8, 0.8);
  shader.set("secondaryColor", 0.2, 0.4, 0.7);
  shader.set("lightPosition", 0.6, 0.0, 2.0);

  shader(shader);

  translate(width/2, height/2);  
  shape(obj);
}

The shader code is directly from the site.

Vert:

/**
* Example Vertex Shader
* Sets the position of the vertex by setting gl_Position
*/

// Set the precision for data types used in this shader
precision highp float;
precision highp int;

// Default THREE.js uniforms available to both fragment and vertex shader
uniform mat4 modelMatrix;
uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;
uniform mat3 normalMatrix;

// Default uniforms provided by ShaderFrog.
uniform vec3 cameraPosition;
uniform float time;

// Default attributes provided by THREE.js. Attributes are only available in the
// vertex shader. You can pass them to the fragment shader using varyings
attribute vec3 position;
attribute vec3 normal;
attribute vec2 uv;
attribute vec2 uv2;

// Examples of variables passed from vertex to fragment shader
varying vec3 vPosition;
varying vec3 vNormal;
varying vec2 vUv;
varying vec2 vUv2;

void main() {

  // To pass variables to the fragment shader, you assign them here in the
  // main function. Traditionally you name the varying with vAttributeName
  vNormal = normal;
  vUv = uv;
  vUv2 = uv2;
  vPosition = position;

  // This sets the position of the vertex in 3d space. The correct math is
  // provided below to take into account camera and object data.
  gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );

}

Frag:

/**
* Example Fragment Shader
* Sets the color and alpha of the pixel by setting gl_FragColor
*/

// Set the precision for data types used in this shader
precision highp float;
precision highp int;

// Default THREE.js uniforms available to both fragment and vertex shader
uniform mat4 modelMatrix;
uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;
uniform mat3 normalMatrix;

// Default uniforms provided by ShaderFrog.
uniform vec3 cameraPosition;
uniform float time;

// A uniform unique to this shader. You can modify it to the using the form
// below the shader preview. Any uniform you add is automatically given a form
uniform vec3 color;
uniform vec3 secondaryColor;
uniform vec3 lightPosition;

// Example varyings passed from the vertex shader
varying vec3 vPosition;
varying vec3 vNormal;
varying vec2 vUv;
varying vec2 vUv2;

void main() {

  // Calculate the real position of this pixel in 3d space, taking into account
  // the rotation and scale of the model. It's a useful formula for some effects.
  // This could also be done in the vertex shader
  vec3 worldPosition = ( modelMatrix * vec4( vPosition, 1.0 )).xyz;

  // Calculate the normal including the model rotation and scale
  vec3 worldNormal = normalize( vec3( modelMatrix * vec4( vNormal, 0.0 ) ) );

  vec3 lightVector = normalize( lightPosition - worldPosition );

  // An example simple lighting effect, taking the dot product of the normal
  // (which way this pixel is pointing) and a user generated light position
  float brightness = dot( worldNormal, lightVector );

  // Fragment shaders set the gl_FragColor, which is a vector4 of
  // ( red, green, blue, alpha ).
  gl_FragColor = vec4( mix(secondaryColor,color,brightness), 1.0 );

}

The code compiles, but the screen is blank and I'm not sure why. Does the implementation of OpenGL used in Processing use different names? For example, I noticed that the PShader tutorial (https://processing.org/tutorials/pshader/) uses a different capitalization scheme than the site: 'modelviewMatrix' vs. 'modelViewMatrix'.

So what exactly would I need to change to get the code working? I know this might be a big question, and I apologize. Alternatively, if anybody has links that could point me in the right direction, they'd be greatly appreciated.
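
Editor's note: the blank screen is consistent with the THREE.js uniform and attribute names simply never being filled in. Processing populates its own names (uniforms transform, modelview, projection, normalMatrix; attributes position, normal, texCoord, as listed in the PShader tutorial), and anything else stays at zero. A sketch of the vertex shader rewritten against those names; the fragment shader needs the same renaming (e.g. modelview in place of modelMatrix, which moves the lighting math into eye space), and unused uniforms like time should be dropped or set from the sketch:

// Vertex shader using Processing's built-in names
uniform mat4 modelview;
uniform mat4 projection;
uniform mat3 normalMatrix;

attribute vec4 position;
attribute vec3 normal;
attribute vec2 texCoord;

varying vec3 vPosition;
varying vec3 vNormal;
varying vec2 vUv;

void main() {
  vNormal = normalMatrix * normal;          // eye-space normal
  vPosition = vec3(modelview * position);   // eye-space position
  vUv = texCoord;
  gl_Position = projection * modelview * position;
}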

Draw points from texture stored locations?


I want to pass a shader a texture of encoded locations and have it draw points at the decoded locations.

I have a particle system stored in a 3000x1 texture, with the x and y locations encoded into RGBA.

Currently I have to use the CPU to loop through the particles and draw each one with point() into a new texture. I know this is done properly in a shader in the PixelFlow library, but I can't figure it out from studying particleRender.glsl.

How can I get a shader to replicate what's going on in draw()? It feels like it should be easy, but from what I've read on the PShader page and The Book of Shaders, I can't piece it together.

Edit: I've updated the post with my attempt at adapting PixelFlow's particleRender.glsl. It isn't throwing errors, but it isn't drawing anything either. I'm not used to the way version 150 works, so maybe it's something simple. I've tried a lot of troubleshooting and I can't get that shader to draw anything at all.

PGraphics pgLocs; 

color xyToRGBA(PVector loc) {
  PVector l = loc.copy();
  l.mult(255);
  float xr = l.x-floor(l.x);
  float yr = l.y-floor(l.y);
  xr *= 255;
  yr *= 255;
  return color(floor(l.x), floor(xr), floor(l.y), floor(yr));
}

PVector RGBAtoXY(color c) {
  float r = red(c)/255.0;
  float g = green(c)/65025.0;
  float b = blue(c)/255.0;
  float a = alpha(c)/65025.0;
  return new PVector((r+g), (b+a));
}

PShader psRender;

PGraphics pgSprite;
void setup() {
  size(800, 800, P3D);
  pgLocs = createGraphics(3000, 1, P2D);
  pgSprite = createGraphics(20, 20, P2D);

  pgSprite.beginDraw();
  pgSprite.background(0);
  pgSprite.fill(255);
  pgSprite.ellipseMode(CENTER);
  pgSprite.ellipse(10, 10, 20, 20);
  pgSprite.endDraw();

  psRender = loadShader("pointVert.glsl", "pointFrag.frag");
  psRender.set("tex_position", pgLocs);
  psRender.set("tex_sprite", pgSprite);
  psRender.set("wh_viewport", width, height);
  psRender.set("wh_position", 3000, 1);
  psRender.set("point_size", 4);

  pgLocs.beginDraw();
  for (int i = 0; i < pgLocs.width; i++) {  // fill pgLocs with random locations
    PVector loc = new PVector(random(width)/width, random(height)/height);
    pgLocs.set(i, 0, xyToRGBA(loc));
  }
  pgLocs.endDraw();

  noLoop();
}

void draw() {
  stroke(0);
  strokeWeight(5);
  background(255);
  //image(pgSprite,0,0);
  shader(psRender); // What I wish would work
  //fill(255);
  //rect(0, 0, width, height);


  for (int i = 0; i < pgLocs.width; i++) { // What I'd like to do in a shader instead
    color c = pgLocs.get(i, 0); //get pixel color
    PVector loc = RGBAtoXY(c); // decode location
    stroke(c);                 // set color just for fun
    point(loc.x*width, loc.y*height); // show location was stored in the texture properly
  }
}

pointFrag.frag

#version 150

uniform float     point_size;
uniform ivec2     wh_position;
uniform sampler2D tex_position;
uniform vec4      col_A = vec4(1, 1, 1, 1.0);
uniform vec4      col_B = vec4(0, 0, 0, 0.0);

out vec2 location;

vec2 posDecode(vec4 c) {
  float r = c.r;
  float g = c.g/255.0;
  float b = c.b;
  float a = c.a/255.0;
  return vec2((r+g), (b+a));
}

void main(){
  int point_id = gl_VertexID;
  int row = point_id / wh_position.x;  
  int col = point_id - wh_position.x * row;

  vec4 color = vec4(texture2D(tex_position, ivec2(col, row)));
  location = posDecode(color);
  gl_Position  = vec4(location * 2.0 - 1.0, 0, 1); 
  gl_PointSize = point_size;
}

pointVert.glsl

#version 150
uniform float     point_size;
uniform vec2      wh_viewport;
uniform sampler2D tex_position;
uniform sampler2D tex_sprite;
uniform vec4      col_A = vec4(1, 1, 1, 1.0);
uniform vec4      col_B = vec4(0, 0, 0, 0.0);

out vec4 out_frag;
in vec2 location;

void main(){
  vec2 my_PointCoord = ((location * wh_viewport) - gl_FragCoord.xy) / point_size + 0.5;
  float falloff = 1.0 - texture(tex_sprite, my_PointCoord.xy).a; 
  out_frag  = mix(col_A, col_B, falloff);  
  out_frag  = clamp(out_frag, 0.0, 1.0);
}
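
Editor's note, two things stand out (assumptions from reading the code, not a tested fix). First, in a #version 150 core shader, texture2D() is gone and it never took ivec2 coordinates anyway; the integer-coordinate lookup is texelFetch(tex_position, ivec2(col, row), 0). Second, shader(psRender) only binds the shader and issues no vertices, so the vertex stage never runs; PixelFlow gets its 3000 invocations from a raw draw call. Something along these lines in draw() would supply them:

PGL pgl = beginPGL();
psRender.bind();
pgl.drawArrays(PGL.POINTS, 0, 3000);   // one vertex per particle; gl_VertexID does the indexing
psRender.unbind();
endPGL();

PixelFlow also enables GL_PROGRAM_POINT_SIZE so that gl_PointSize takes effect; if the points come out at one pixel, that's worth checking too.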

Are Processing's shaders fast enough?


It's said that OpenFrameworks runs much faster than Processing, so many artists choose OF over Processing for live performances. But I wonder how fast Processing's shaders are compared with OF: are they fast enough for an audiovisual live performance?
