Channel: GLSL / Shaders - Processing 2.x and 3.x Forum

How does the point shader work with custom attributes?


Hi, I'm trying to make a particle system where particle positions and velocities are calculated in a shader. I'm stuck because I cannot understand how some parameters of the point shader work.

In particular, I cannot get a point shader to take custom attributes. Is that possible?

1) I tried the attrib() function, which works fine with triangle and quad shapes, but when I set the shape type to POINTS it stops working.

2) I tried to pass some data through the default vertex attributes such as 'uv' or 'normal', but that does not work either.

3) So I tried a solution based on QUADS instead of a point shader, but then, when I do as in the Static Particles Retained example and use createShape(PShape.GROUP), custom attributes again stop working and I get a NullPointerException:

particles = createShape(PShape.GROUP) ;

sprite = loadImage("sprite.png");

for (int n = 0; n < npartTotal;  n++) {

float cx = random(-500, +500);
float cy = random(-500, +500);
float cz = random(-500, +500);

PShape part = createShape();
part.beginShape(QUAD);
part.noStroke();
part.tint(255);
part.texture(sprite);
part.normal(0, 0, 1);

part.attrib("custom",1.0); //this does not work

part.vertex(cx - partSize/2, cy - partSize/2, cz, 0, 0);
part.vertex(cx + partSize/2, cy - partSize/2, cz, sprite.width, 0);
part.vertex(cx + partSize/2, cy + partSize/2, cz, sprite.width, sprite.height);
part.vertex(cx - partSize/2, cy + partSize/2, cz, 0, sprite.height);
part.endShape();
particles.addChild(part);

4) So I decided to use plain QUADS, without groups or anything, and that seems to work, but now I'm stuck because I can't get each quad to act like a billboard and always face the screen. Is there a way to have quads always face the camera?
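On the billboard question in (4): one common approach keeps each quad's center in a custom attribute and adds the corner offset in eye space inside the vertex shader, so the quad stays parallel to the screen no matter how the camera rotates. A hedged sketch of such a vertex shader — the `center` attribute and the offset-in-x/y layout are assumptions about how the sketch would feed the data, not Processing defaults:

```glsl
// Hypothetical billboard vertex shader, to pair with a simple texturing
// fragment shader. Assumes the sketch stores each quad's center via
// part.attrib("center", cx, cy, cz) and puts only the corner offset
// (e.g. +-partSize/2) in the vertex x/y.
uniform mat4 modelview;
uniform mat4 projection;
uniform mat4 texMatrix;

attribute vec4 vertex;    // corner offset relative to the center
attribute vec3 center;    // custom per-vertex attribute: particle center
attribute vec2 texCoord;

varying vec4 vertTexCoord;

void main() {
  // Transform only the center into eye space, then add the untransformed
  // corner offset there, so the quad stays screen-aligned.
  vec4 eyeCenter = modelview * vec4(center, 1.0);
  vec4 eyePos = eyeCenter + vec4(vertex.xy, 0.0, 0.0);
  vertTexCoord = texMatrix * vec4(texCoord, 1.0, 1.0);
  gl_Position = projection * eyePos;
}
```

On the sketch side this would mean calling part.attrib("center", cx, cy, cz) for each vertex and emitting the four corners as offsets such as (-partSize/2, -partSize/2, 0).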

It would be very practical if point shaders supported custom attributes.

5) I can't get the point shader to use a per-vertex strokeWeight attribute the way the line shader does. So what is the correct way to render each point with a different radius?

Last question! Taking inspiration from the advanced OpenGL example, I was able to render a buffer of vertices with OpenGL's point-sprite feature, where each vertex corresponds to a point. I was able to pass textures and custom attributes such as uv, size, etc., but at that point I was just making low-level calls and Processing was not very useful.

So, is there a way to render each vertex as a point sprite using just the Processing API?
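For reference, the basic way to attach a point shader using only the Processing API — a minimal sketch; the shader filenames are assumptions, and strokeWeight() sets one size for all points rather than per vertex:

```processing
PShader pointShader;

void setup() {
  size(640, 360, P3D);
  // Assumed filenames for a point shader pair.
  pointShader = loadShader("pointfrag.glsl", "pointvert.glsl");
}

void draw() {
  background(0);
  shader(pointShader, POINTS);  // apply only to point geometry
  stroke(255);
  strokeWeight(8);              // one size for every point
  for (int i = 0; i < 500; i++) {
    point(noise(i) * width, noise(i + 1000.0) * height, 0);
  }
}
```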


Multipass Shader Example


Is there a multipass rendering example for Processing 3.0 somewhere, like a bloom effect or something similar? I have been trying for a long time with the Processing advanced GL example using PJOGL, but I think I need an example to continue.

Best regards, Florian

How to port Shadertoy multipass GLSL shader


Hi,

I'm trying to port these Shadertoy fragment shaders to Processing using PShader: https://www.shadertoy.com/view/XsG3z1#

I'm not sure I understand how to do the multipass correctly.

Here's my attempt so far:

BufA.frag:

// Reaction-diffusion pass.
//
// Here's a really short, non technical explanation:
//
// To begin, sprinkle the buffer with some initial noise on the first few frames (Sometimes, the
// first frame gets skipped, so you do a few more).
//
// During the buffer loop pass, determine the reaction diffusion value using a combination of the
// value stored in the buffer's "X" channel, and the blurred value - stored in the "Y" channel
// (You can see how that's done in the code below). Blur the value from the "X" channel (the old
// reaction diffusion value) and store it in "Y", then store the new (reaction diffusion) value
// in "X." Display either the "X" value  or "Y" buffer value in the "Image" tab, add some window
// dressing, then repeat the process. Simple... Slightly confusing when I try to explain it, but
// trust me, it's simple. :)
//
// Anyway, for a more sophisticated explanation, here are a couple of references below:
//
// Reaction-Diffusion by the Gray-Scott Model - http://www.karlsims.com/rd.html
// Reaction-Diffusion Tutorial - http://www.karlsims.com/rd.html

uniform vec2 resolution;
uniform float time;
uniform int frame;

uniform sampler2D iChannel0;


// Cheap vec3 to vec3 hash. Works well enough, but there are other ways.
vec3 hash33(in vec2 p){
    float n = sin(dot(p, vec2(41, 289)));
    return fract(vec3(2097152, 262144, 32768)*n);
}

// Serves no other purpose than to save having to write this out all the time. I could write a
// "define," but I'm pretty sure this'll be inlined.
vec4 tx(in vec2 p){ return texture2D(iChannel0, p); }

// Weighted blur function. Pretty standard.
float blur(in vec2 p){

    // Used to move to adjoining pixels. - uv + vec2(-1, 1)*px, uv + vec2(1, 0)*px, etc.
    vec3 e = vec3(1, 0, -1);
    vec2 px = 1./resolution.xy;

    // Weighted 3x3 blur, or a cheap and nasty Gaussian blur approximation.
    float res = 0.0;
    // Four corners. Those receive the least weight.
    res += tx(p + e.xx*px ).x + tx(p + e.xz*px ).x + tx(p + e.zx*px ).x + tx(p + e.zz*px ).x;
    // Four sides, which are given a little more weight.
    res += (tx(p + e.xy*px ).x + tx(p + e.yx*px ).x + tx(p + e.yz*px ).x + tx(p + e.zy*px ).x)*2.;
    // The center pixel, which we're giving the most weight to, as you'd expect.
    res += tx(p + e.yy*px ).x*4.;
    // Normalizing.
    return res/16.;

}

// The reaction diffusion loop.
//
void main(){


    vec2 uv = gl_FragCoord.xy/resolution.xy; // Screen coordinates. Range: [0, 1]
    // vec2 uv = (gl_FragCoord.xy * 2.0 - resolution.xy) / resolution.y;
    vec2 pw = 1./resolution.xy; // Relative pixel width. Used for neighboring pixels, etc.


    // The blurred pixel. This is the result that's used in the "Image" tab. It's also reused
    // in the next frame in the reaction diffusion process (see below).
    float avgReactDiff = blur(uv);


    // The noise value. Because the result is blurred, we can get away with plain old static noise.
    // However, smooth noise, and various kinds of noise textures will work, too.
    vec3 noise = hash33(uv + vec2(53, 43)*time)*.6 + .2;

    // Used to move to adjoining pixels. - uv + vec2(-1, 1)*px, uv + vec2(1, 0)*px, etc.
    vec3 e = vec3(1, 0, -1);

    // Gradient epsilon value. The "1.5" figure was trial and error, but was based on the 3x3 blur radius.
    vec2 pwr = pw*1.5;

    // Use the blurred pixels (stored in the Y-Channel) to obtain the gradient. I haven't put too much
    // thought into this, but the gradient of a pixel on a blurred pixel grid (average neighbors), would
    // be analogous to a Laplacian operator on a 2D discrete grid. Laplacians tend to be used to describe
    // chemical flow, so... Sounds good, anyway. :)
    //
    // Seriously, though, take a look at the formula for the reaction-diffusion process, and you'll see
    // that the following few lines are simply putting it into effect.

    // Gradient of the blurred pixels from the previous frame.
    vec2 lap = vec2(tx(uv + e.xy*pwr).y - tx(uv - e.xy*pwr).y, tx(uv + e.yx*pwr).y - tx(uv - e.yx*pwr).y);//

    // Add some diffusive expansion, scaled down to the order of a pixel width.
    uv = uv + lap*pw*3.0;

    // Stochastic decay. I.e., a differential equation, influenced by noise.
    // You need the decay, otherwise things would keep increasing, which in this case means a white screen.
    float newReactDiff = tx(uv).x + (noise.z - 0.5)*0.0025 - 0.002;

    // Reaction-diffusion.
    newReactDiff += dot(tx(uv + (noise.xy-0.5)*pw).xy, vec2(1, -1))*0.145;


    // Storing the reaction diffusion value in the X channel, and avgReactDiff (the blurred pixel value)
    // in the Y channel. However, for the first few frames, we add some noise. Normally, one frame would
    // be enough, but for some weird reason, it doesn't always get stored on the very first frame.
    if(frame > 9) gl_FragColor.xy = clamp(vec2(newReactDiff, avgReactDiff/.98), 0., 1.);
    else gl_FragColor = vec4(noise, 1.);

}

shader.frag:

// Reaction Diffusion - 2 Pass
// https://www.shadertoy.com/view/XsG3z1#

/*
    Reaction Diffusion - 2 Pass
    ---------------------------

    Simple 2 pass reaction-diffusion, based off of "Flexi's" reaction-diffusion examples.
    It takes about ten seconds to reach an equilibrium of sorts, and in the order of a
    minute longer for the colors to really settle in.

    I'm really thankful for the examples Flexi has been putting up lately. From what I
    understand, he's used to showing his work to a lot more people on much bigger screens,
    so his code's pretty reliable. Reaction-diffusion examples are temperamental. Change
    one figure by a minute fraction, and your image can disappear. That's why it was really
    nice to have a working example to refer to.

    Anyway, I've done things a little differently, but in essence, this is just a rehash
    of Flexi's "Expansive Reaction-Diffusion" example. I've stripped this one down to the
    basics, so hopefully, it'll be a little easier to take in than the multitab version.

    There are no outside textures, and everything is stored in the A-Buffer. I was
    originally going to simplify things even more and do a plain old, greyscale version,
    but figured I'd better at least try to pretty it up, so I added color and some very
    basic highlighting. I'll put up a more sophisticated version at a later date.

    By the way, for anyone who doesn't want to be weighed down with extras, I've provided
    a simpler "Image" tab version below.

    One more thing. Even though I consider it conceptually impossible, it wouldn't surprise
    me at all if someone, like Fabrice, produces a single pass, two tweet version. :)

    Based on:

    // Gorgeous, more sophisticated example:
    Expansive Reaction-Diffusion - Flexi
    https://www.shadertoy.com/view/4dcGW2

    // A different kind of diffusion example. Really cool.
    Gray-Scott diffusion - knighty
    https://www.shadertoy.com/view/MdVGRh


*/

uniform sampler2D iChannel0;
uniform vec2 resolution;
uniform float time;

/*
// Ultra simple version, minus the window dressing.
void main(){

    gl_FragColor = 1. - texture2D(iChannel0, gl_FragCoord.xy/resolution.xy).wyyw + (time * 0.);

}
//*/


//*
void main(){


    // The screen coordinates.
    vec2 uv = gl_FragCoord.xy/resolution.xy;
    // vec2 uv = (gl_FragCoord.xy * 2.0 - resolution.xy) / resolution.y;

    // Read in the blurred pixel value. There's no rule that says you can't read in the
    // value in the "X" channel, but blurred stuff is easier to bump, that's all.
    float c = 1. - texture2D(iChannel0, uv).y;
    // Reading the same value at a slightly offset position. The difference between
    // "c2" and "c" is used to provide the highlighting.
    float c2 = 1. - texture2D(iChannel0, uv + .5/resolution.xy).y;


    // Color the pixel by mixing two colors in a sinusoidal kind of pattern.
    //
    float pattern = -cos(uv.x*0.75*3.14159-0.9)*cos(uv.y*1.5*3.14159-0.75)*0.5 + 0.5;
    //
    // Blue and gold, for an abstract sky over a... wheat field look. Very artsy. :)
    vec3 col = vec3(c*1.5, pow(c, 2.25), pow(c, 6.));
    col = mix(col, col.zyx, clamp(pattern-.2, 0., 1.) );

    // Extra color variations.
    //vec3 col = mix(vec3(c*1.2, pow(c, 8.), pow(c, 2.)), vec3(c*1.3, pow(c, 2.), pow(c, 10.)), pattern );
    //vec3 col = mix(vec3(c*1.3, c*c, pow(c, 10.)), vec3(c*c*c, c*sqrt(c), c), pattern );

    // Adding the highlighting. Not as nice as bump mapping, but still pretty effective.
    col += vec3(.6, .85, 1.)*max(c2*c2 - c*c, 0.)*12.;

    // Apply a vignette and increase the brightness for that fake spotlight effect.
    col *= pow( 16.0*uv.x*uv.y*(1.0-uv.x)*(1.0-uv.y) , .125)*1.15;

    // Fade in for the first few seconds.
    col *= smoothstep(0., 1., time/2.);

    // Done.
    gl_FragColor = vec4(min(col, 1.), 1.);

}
//*/

and the sketch:

//Reaction Diffusion - 2 Pass
// https://www.shadertoy.com/view/XsG3z1

PShader bufA,shader;

void setup(){
  size(640,480,P2D);
  noStroke();

  bufA = loadShader("BufA.frag");
  bufA.set("resolution",(float)width,(float)height);
  bufA.set("time",0.0);

  shader = loadShader("shader.frag");
  shader.set("resolution",(float)width,(float)height);
}
void draw(){
  bufA.set("iChannel0",get());
  bufA.set("time",frameCount * .1);
  bufA.set("frame",frameCount);

  shader(bufA);
  background(0);
  rect(0,0,width,height);

  //2nd pass
  //resetShader();
  shader.set("iChannel0",get());
  shader.set("time",frameCount * .1);
  shader(shader);
  rect(0,0,width,height);
}

The shaders compile and run, but the output differs from what I see on Shadertoy: the Processing version stabilizes quite quickly, and it doesn't look like the feedback is working.
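One likely culprit: get() grabs the main canvas after both passes have already been composited, so BufA never feeds back on its own previous output the way Shadertoy's Buffer A does. A hedged rework of the sketch using two offscreen PGraphics as a ping-pong pair, so the feedback path is explicit:

```processing
PShader bufA, shader;
PGraphics ping, pong;

void setup() {
  size(640, 480, P2D);
  noStroke();
  ping = createGraphics(width, height, P2D);
  pong = createGraphics(width, height, P2D);
  bufA = loadShader("BufA.frag");
  bufA.set("resolution", (float)width, (float)height);
  shader = loadShader("shader.frag");
  shader.set("resolution", (float)width, (float)height);
}

void draw() {
  // Pass 1: run BufA over last frame's result (read pong, write ping).
  bufA.set("iChannel0", pong);
  bufA.set("time", frameCount * .1);
  bufA.set("frame", frameCount);
  ping.beginDraw();
  ping.noStroke();
  ping.shader(bufA);
  ping.rect(0, 0, ping.width, ping.height);
  ping.endDraw();

  // Pass 2: color the reaction-diffusion buffer onto the screen.
  shader.set("iChannel0", ping);
  shader.set("time", frameCount * .1);
  shader(shader);
  rect(0, 0, width, height);

  // Swap, so the next frame reads what this frame wrote.
  PGraphics tmp = ping;
  ping = pong;
  pong = tmp;
}
```

On the first few frames pong is still empty, but BufA's frame < 10 noise-seeding path covers that.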

Orange Book Code to Processing


Hi guys, I'm new to shaders and I'm starting to read the Orange Book to learn about them. I've managed to translate the first example of the book, following Andres Colubri's Processing shader tutorial (https://processing.org/tutorials/pshader/), but it shows a strange issue even though it's quite similar to the final output shown. If you run the code below, you should notice a strange issue with rotate() that makes the sphere look blobby; it seems the rotation affects the vertex shader but not the fragment shader, and I don't understand why. Can anyone help me?

Processing Code:

PShader shader;

float a = 0.0;

void setup() {
  size(600, 600, P3D);
  noStroke();


  shader = loadShader("OBfrag1.glsl","OBver1.glsl");

  shader.set("BrickColor", 0.5, 0.1, 0.1);
  shader.set("MortarColor", 0.5, 0.5, 0.5);
  shader.set("BrickSize", 0.1, 0.1);
  shader.set("BrickPct", 0.9, 0.9);
}

void draw() {
  background(255);
  shader(shader);

  pointLight(255, 255, 255, width/2, height/2, 500);

  translate(width/2, height/2);
  rotateY(a);

  fill(255);
  sphere(200);

  a += 0.01;
}

Vertex Shader ( OBver1.glsl ):

#define PROCESSING_LIGHT_SHADER

uniform mat4 modelview;
uniform mat4 transform;
uniform mat3 normalMatrix;
uniform vec4 lightPosition;

const float SpecularContribution = 0.2;
const float DiffuseContribution = 1.0 - SpecularContribution;

attribute vec4 vertex;
attribute vec4 color;
attribute vec3 normal;

varying vec4 vertColor;
varying float LightIntensity;
varying vec2 MCposition;

void main() {
    vec3 ecPosition = vec3(modelview * vertex);
    vec3 tNorm = normalize(normalMatrix * normal);

    vec3 lightVec = normalize(lightPosition.xyz - ecPosition);
    vec3 reflectVec = reflect(-lightVec, tNorm);
    vec3 viewVec = normalize(-ecPosition);

    float diffuse = max(dot(lightVec, tNorm), 0.0);
    float spec = 0.0;

    if (diffuse > 0.0) {
        spec = max(dot(reflectVec, viewVec), 0.0);
        spec = pow(spec, 16.0);
    }

    LightIntensity = DiffuseContribution * diffuse +
                     SpecularContribution * spec;

    MCposition = tNorm.xy;
    vertColor = vec4(LightIntensity, LightIntensity, LightIntensity, 1) * color;
    gl_Position = transform * vertex;
}

Fragment Shader ( OBfrag1.glsl ):

#ifdef GL_ES
precision mediump float;
precision mediump int;
#endif

uniform vec3 BrickColor, MortarColor;
uniform vec2 BrickSize;
uniform vec2 BrickPct;

varying vec4 vertColor;
varying float LightIntensity;
varying vec2 MCposition;


void main() {

    vec3 color;
    vec2 position, useBrick;
    position = MCposition / BrickSize;

    if (fract(position.y * 0.5) > 0.5) {
        position.x += 0.5;
    }

    position = fract(position);
    useBrick = step(position, BrickPct);

    color = mix(MortarColor, BrickColor, useBrick.x * useBrick.y);
    color *= LightIntensity;

    gl_FragColor = vec4(color, 1.0) * vertColor;
}
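A note on the "blobby" rotation: in the vertex shader above, MCposition is taken from the transformed normal (tNorm.xy), so the brick pattern is defined in a space that rotates with the eye-space normals while the geometry is transformed separately. In the Orange Book this varying carries the model-space vertex position, which does not change as the modelview matrix rotates; the equivalent here would be something like:

```glsl
// In the vertex shader: anchor the brick pattern to the model-space
// position rather than the eye-space normal.
MCposition = vertex.xy;
```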

How to draw many instances (Low Level GL)


Is there any example of how to draw many instances of the same PShape or mesh? I found a Cinder example for C++ that uploads the transforms with a GL array and lets the shader read out the transformation data (the usual way to do it, from what I understand).

I am trying to translate that to PGL, but looking at the Processing low-level GL samples, I really need some help / tips on where to start.
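As a starting point, one hedged sketch of the low-level route: get the raw JOGL handle through PJOGL, give the shader one offset/transform per instance via glVertexAttribDivisor, and issue an instanced draw. Everything here assumes a GL3-capable profile, and the program/VAO/buffer setup (elided in the comment) would be done with plain JOGL calls:

```processing
import com.jogamp.opengl.GL3;
import processing.opengl.PGL;
import processing.opengl.PJOGL;

void drawInstanced(int vertexCount, int instanceCount) {
  PGL pgl = beginPGL();
  GL3 gl = ((PJOGL) pgl).gl.getGL3();  // assumes a GL3-capable context

  // ... bind your shader program and VAO here, with an extra buffer that
  // holds one offset/transform per instance, declared once via
  // gl.glVertexAttribDivisor(attribLoc, 1) so it advances per instance ...

  gl.glDrawArraysInstanced(GL3.GL_TRIANGLES, 0, vertexCount, instanceCount);
  endPGL();
}
```

In the vertex shader, the per-instance attribute (or gl_InstanceID indexing a uniform array) supplies the transform, much like the Cinder sample.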

Star-like shining effect


I am trying to create a shining effect like the ones shown below. I think you can do this with shaders, but I do not have experience with them. I have a logo competition with a deadline of about 6 hrs. Could anyone share some code or point me to an example in Processing where I could get started? I am working in P3D mode, if that makes any difference. No experience here, just looking for a little help, or maybe somebody could share some code for something similar. It would be great to add this touch to my design.

Kf

[Attached images: shiny_star_3, maxresdefault, images, Star Shine]

How to use GPU?


Hi folks, does anyone know how to use the GPU in a sketch? I tried the P3D and OpenGL modes on several MacBooks and PCs, with integrated graphics and with an Nvidia GTX 980 graphics card, but the frame rate and overall performance are the same on every machine.
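For what it's worth, the P2D and P3D renderers already run on the GPU through OpenGL; identical numbers across machines usually mean the sketch is CPU-bound or capped by vsync (commonly 60 fps). The usual way to put real per-pixel work on the GPU is a fragment shader applied with filter(); a minimal hedged sketch (the shader filename is an assumption):

```processing
PShader fx;

void setup() {
  size(640, 360, P2D);
  fx = loadShader("effect.glsl");  // any fragment shader; assumed filename
}

void draw() {
  background(0);
  ellipse(mouseX, mouseY, 100, 100);
  filter(fx);  // runs the fragment shader over every pixel, on the GPU
}
```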

blob detection in glsl shader


Hello,

For an installation, I use a Kinect V2 to obtain a depth map of a space. I need to calculate some blob centroids from that depth map. I tried the BlobDetection, Blobscanner, and OpenCV libraries, and they were very slow, not usable. At the moment the best solution is to send two preprocessed depth images to Isadora via Syphon, use two Eyes++ actors in Isadora (blob detector actors), and send the result via OSC back to Processing, finally using a fragment shader to process the final image.

It's complicated, but it works at 30 fps in Processing and Isadora with very acceptable lag.

I'm looking for a way to do the blob detection in a GLSL shader via Processing.

Does anyone have an idea how to do that?

Thank you in advance. Jacques
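Blob labeling itself is awkward in a fragment shader, since each pixel runs independently; what ports easily is the per-pixel cleanup, after which centroids can be computed on the CPU from a much smaller downsampled readback. A hedged sketch of a GLSL threshold pass (the uniform names and normalized depth range are assumptions):

```glsl
// Threshold a depth texture into a binary mask, as a first GPU pass.
uniform sampler2D depthMap;   // assumed: the Kinect depth image
uniform vec2 resolution;
uniform float nearT;          // assumed band thresholds, normalized 0..1
uniform float farT;

void main() {
  vec2 uv = gl_FragCoord.xy / resolution;
  float d = texture2D(depthMap, uv).r;
  // Keep only pixels inside the depth band of interest.
  float inBand = step(nearT, d) * (1.0 - step(farT, d));
  gl_FragColor = vec4(vec3(inBand), 1.0);
}
```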


Need help on school project regarding UV mapping with texture images


I am having a hard time wrapping my head around how to set UVs for textures. Every time I try to display the shape with the texture, it vanishes.

PImage iceCream;
PImage waffleCone;

PShape cone;
PShape cream;

PShader texlightShader;
PShader shader2;
PShader toon;

float angle;
boolean button = false;
//boolean button2 = false;
//boolean button3 = false;


void setup() {
  size(800, 800, P3D);
  iceCream = loadImage("cream.jpg");
  waffleCone = loadImage("waffle.jpg");

  texlightShader = loadShader("texlightfrag.glsl", "texlightvert.glsl");
  shader2 = loadShader("lightfrag.glsl", "lightvert.glsl");

  toon = loadShader("frag.glsl", "vert.glsl");
  toon.set("fraction", 1.0);

  //noLoop();
    frameRate(30);
}

void draw() {
  background(0);
  lights();
  translate(width / 2, height / 1.5);

  ////cameras
    rotateY(angle);
    //rotateX(map(mouseX, 0, width, 0, PI));
    //rotateY(map(mouseX, 0, width, 0, PI));
    rotateZ(map(height, 0, height, 0, -PI));



  ////lights
    //pointLight(255, 255, 255, width/2, height, 600);
    directionalLight(255,255,255, -1, 0, 0);

    //float dirY = (mouseY / float(height) - 0.5) * 2;
    //float dirX = (mouseX / float(width) - 0.5) * 2;
    //directionalLight(204, 204, 204, -dirX, -dirY, -3);


  noStroke();
  fill(0, 0, 255);
  translate(0, -40, 0);

  ///buttons to be implemented later
  //if(!button) {shader(toon);} else {resetShader();}
  //if (!button2){noStroke();}else { stroke(0);


  if(!button) {
  //shader(toon); not working must troubleshoot
  resetShader();
  drawCylinder_noTex(10, 75, 250, 16);}

  else {
  shader(texlightShader);
  cone = drawCylinder(10, 75, 250, 16, waffleCone); }

  //cone = drawCylinder(10, 75, 250, 16, waffleCone);
  //drawCylinder_noTex(10, 75, 250, 16);

 angle += 0.01;
}

PShape drawCylinder(float topRadius, float bottomRadius, float tall, int sides, PImage tex) {
  textureMode(NORMAL);
  PShape sh = createShape();


  sh.beginShape(QUAD_STRIP);
  //sh.noStroke();
  sh.texture(tex);

  // angle and angleIncrement moved outside the loop; declaring them inside
  // reset the angle to 0 on every pass, collapsing all vertices to angle 0.
  float angle = 0;
  float angleIncrement = TWO_PI / sides;

  for (int i = 0; i < sides + 1; ++i) {

    sh.vertex(topRadius*cos(angle), 0, topRadius*sin(angle), 0);
    sh.vertex(bottomRadius*cos(angle), tall, bottomRadius*sin(angle), 100);
    angle += angleIncrement;
  }
  sh.endShape();
  return sh;

/*
  pushMatrix();

    //ice cream
    translate(0,height/3);
    sphere(75);
  popMatrix();

*/
}

void drawCylinder_noTex(float topRadius, float bottomRadius, float tall, int sides) {
  textureMode(NORMAL);
  createShape();

  float angle = 0;
  float angleIncrement = TWO_PI / sides;
  beginShape(QUAD_STRIP);
  //sh.texture(tex);

  for (int i = 0; i < sides + 1; ++i) {
    vertex(topRadius*cos(angle), 0, topRadius*sin(angle));
    vertex(bottomRadius*cos(angle), tall, bottomRadius*sin(angle));
    angle += angleIncrement;
  }
  endShape();

/*
  pushMatrix();

    //ice cream
    translate(0,height/3);
    sphere(75);
  popMatrix();
  */
}

void keyPressed()
  {
    if (key == 'b' || key == 'B') {button = !button;}
    //if (key == 'n' || key == 'N') {button2 = !button2;}  //future implementation
    //if (key == 'm' || key == 'M') {button3 = !button3;} //future implementation
  }
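One thing worth double-checking in drawCylinder(): Processing reads a four-argument vertex() call as vertex(x, y, u, v), so the calls there lose the intended z coordinate and push 0/100 in as texture coordinates — and with textureMode(NORMAL), UVs should sit in 0..1 anyway. A hedged corrected loop using the five-argument form:

```processing
// Sketch of a corrected vertex loop for the textured cylinder.
float angle = 0;
float angleIncrement = TWO_PI / sides;
for (int i = 0; i < sides + 1; ++i) {
  float u = i / (float)sides;  // wrap the texture once around: 0..1
  sh.vertex(topRadius * cos(angle),    0,    topRadius * sin(angle),    u, 0);
  sh.vertex(bottomRadius * cos(angle), tall, bottomRadius * sin(angle), u, 1);
  angle += angleIncrement;
}
```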

Weird error with P3D


Okay, this was not happening a while ago and I have NO IDEA why this stopped working.

It only happens with P3D.

After a background() call, the shape I'm working with is always drawn in the back, not in the front as it was drawn at first. The weird thing is that it's only the body, NOT THE STROKE, of the shape.

This is my code.

boolean isbg = false;
void setup(){

  size(1200,600,P3D);
     background(0);
}

void draw(){

  if (isbg){
   background(0);
  }
 ellipse(mouseX,mouseY,100,100);

}

void keyPressed(){

  if (key == 'd'){
  isbg = !isbg;
 }
}

This happens EVERY TIME after the background() function runs.

This just destroyed an entire piece of software I was working on. It happens whether it's an ellipse, a rect, or a custom shape.

I'VE TRIED EVERYTHING

It is always AFTER the background() function runs.

Image Filter Shader: Create a line every 20 pixels on the x axis


I have been playing around with trying to draw a line every 20 pixels on the x axis, but the mod never comes out exactly zero, and I don't know why. When I apply the filter to the image, it creates lines about every 20 pixels, but some lines are thicker than one pixel. I don't get any result for mod < 1.0, so I had to use mod < 1.1 to get the lines at all. How would I fix the code to get lines every 20 pixels that are only one pixel wide? Thanks.

This is on Android, btw.

precision mediump float;
uniform sampler2D inputImageTexture;
varying vec2 textureCoordinate;

uniform float resX;
uniform float resY;

void main(){
    vec3 color = texture2D(inputImageTexture, textureCoordinate).rgb;

    float a = 0.0;

    if(textureCoordinate.x < 0.0){
        a = (1.0 - (textureCoordinate.x * -1.0)) * (resX/2.0);
    }else{
        a = (textureCoordinate.x * (resX/2.0)) + (resX/2.0);
    }

    float yCoord = floor(a);

    float modX = mod(yCoord, 20.0);

    if(modX < 1.1){
        color = vec3(0.4);
    }

    gl_FragColor = vec4(color, 1.0);
}
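The reconstruction of the pixel coordinate above can land between pixel centers, which is why the mod rarely hits zero exactly. Computing the column index straight from the texture coordinate avoids the back-and-forth — this assumes textureCoordinate runs 0..1 across the image, which is the usual convention; if yours really spans -1..1, remap it first:

```glsl
precision mediump float;
uniform sampler2D inputImageTexture;
varying vec2 textureCoordinate;
uniform float resX;

void main() {
  vec3 color = texture2D(inputImageTexture, textureCoordinate).rgb;
  // Integer pixel column, assuming textureCoordinate.x runs 0..1.
  float col = floor(textureCoordinate.x * resX);
  // Exactly one column in every 20 becomes the line.
  if (mod(col, 20.0) < 0.5) {
    color = vec3(0.4);
  }
  gl_FragColor = vec4(color, 1.0);
}
```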

How to make this effect (sweater stitch)


Hi, is there a way to make a sweater-stitch shader like this? :)

Nebula Shader from Examples (What causes the 'zoom')


I was toying around with the nebula shader example, specifically the nebula.glsl file.

The code is essentially made up of a nebula portion and a stars portion. My questions are about the nebula portion, so I have commented out the stars portion.

I'm only looking at one of the two nebulae created in the original example. I stopped the rotation and have been adjusting some of the parameters. I'm trying to find out what makes the nebula seem to zoom in and out; I don't want it to do that, but I'm having trouble pinpointing the cause of the zoom effect. Any help would be appreciated.

//Utility functions

// Uniforms set from the sketch below; they were missing from the pasted
// shader and are required for it to compile.
uniform float time;
uniform vec2 resolution;

vec3 fade(vec3 t) {
  return vec3(1.0,1.0,1.0);//t*t*t*(t*(t*6.0-15.0)+10.0);
}

vec2 rotate(vec2 point, float rads) {
    float cs = cos(rads);
    float sn = sin(rads);
    return point * mat2(cs, -sn, sn, cs);
}

vec4 randomizer4(const vec4 x)
{
    vec4 z = mod(x, vec4(5612.0));
    z = mod(z, vec4(3.1415927 * 2.0));
    return(fract(cos(z) * vec4(56812.5453)));
}

// Fast computed noise
// http://www.gamedev.net/topic/502913-fast-computed-noise/

const float A = 1.0;
const float B = 57.0;
const float C = 113.0;
const vec3 ABC = vec3(A, B, C);
const vec4 A3 = vec4(0, B, C, C+B);
const vec4 A4 = vec4(A, A+B, C+A, C+A+B);

float cnoise4(const in vec3 xx)
{
    vec3 x = mod(xx + 32768.0, 65536.0);
    vec3 ix =   floor(x);
    vec3 fx = fract(x); // returns fractional part of x.
    //vec3 wx = vec3(0,0,0);
    //vec3 wx = vec3(0.0,-1.,1.0);
    //vec3 wx = fx; wx.y=0;

    vec3 wx = fx*fx*(3.0-2.0*fx);
    //vec3 wx = fx*fx*fx;
    //wx.x=0.0;
    float nn = dot(ix, ABC);

    vec4 N1 = nn + A3;
    vec4 N2 =  nn + A4;
    vec4 R1 = randomizer4(N1);
    vec4 R2 = randomizer4(N2);
    vec4 R = mix(R1, R2, wx.x);
    float re = mix(mix(R.x, R.y, wx.y), mix(R.z, R.w, wx.y), wx.z);

    return 1.0 - 2.0 * re;
}
float surface3 ( vec3 coord, float frequency ) {

    float n = 0.0;

    n += 1.0    * abs( cnoise4( coord * frequency ) );
    n += 0.5    * abs( cnoise4( coord * frequency * 2.0 ) );
    n += 0.25   * abs( cnoise4( coord * frequency * 4.0 ) );
    n += 0.125  * abs( cnoise4( coord * frequency * 8.0 ) );
    n += 0.0625 * abs( cnoise4( coord * frequency * 16.0 ) );

    return n;
}

void main( void ) {
    float rads = radians(time*3.15);
    vec2 position = gl_FragCoord.xy / resolution.xy;
    //position += rotate(position, rads); // rotates everything slowly.

    //float n = surface3(vec3(position*sin(time*0.1), time * 0.05)*mat3(1,0,0,0,.8,.6,0,-.6,.8),2.0); //.9  //layer one
    //float n = surface3(vec3(position*sin(time*0.1), time * 0.05) ,2.0); //.9  //layer one

    float n2 = surface3(vec3(position*cos(time*0.1), time * 0.04)*mat3(1,0,0,0,.8,.6,0,-.6,.8),6.0); // .8 // layer two
    //vec2 test = position*cos(time*0.1);
    //float n2 = surface3(vec3(test, time * 0.04) ,6.0); // .8 // layer two
    //float n2 = surface3(vec3(position*cos(time*0.1), 0) ,6); // .8 // layer two . z not affected

    //float n2 = test.x;

    //float lum = length(n);
    float lum2 =  length(n2); //length of a scalar float is abs value.

    //vec3 tc = pow(vec3(1.0-lum),vec3(sin(position.x)+cos(time)+4.0,8.0+sin(time)+4.0,8.0)); // layer one
    vec3 tc2 = pow(vec3(1.1-lum2),vec3(5.0,position.y+cos(time)+7.0,sin(position.x)+sin(time)+2.0)); // layer 2
    //vec3 tc2 = pow(vec3(1.1-lum2,1.1-lum2,1.1-lum2),vec3(5.0,position.y+cos(time)+7.0,sin(position.x)+sin(time)+2.0)); // layer 2
    //vec3 tc2 = pow(vec3(1.1-lum2,1.1-lum2,1.1-lum2),vec3(5.0,7, 2.0)); // layer 2

    vec3 curr_color = /*(tc*0.8) */+ (tc2*0.5);
    //curr_color.z=0;
    curr_color =vec3(lum2, lum2, lum2); //makes black n white
    ///////////////////////////////////////////////end nebula...//////////////////////////////
    //Let's draw some stars

    float scale = sin(0.3 * time) + 20.0;
    vec2 position2 = (((gl_FragCoord.xy / resolution) - 0.5) * scale);
    float gradient = 0.0;
    //vec3 color = vec3(100.0); // not used
    //float fade = 0.0; // this fade is not used.
    float z = 0.0;
    vec2 centered_coord = position2;// - vec2(sin(time*0.1),sin(time*0.1)); //what does this do?
    //vec2 centered_coord = position2 - vec2(sin(time*0.4),sin(time*0.4));  // it makes you move along x y plane back n forth.
    //centered_coord = rotate(centered_coord, rads); // rotates about z.

    for (float i=1.0; i<=100.0; i++)
    {
        //vec2 star_pos = vec2(sin(i) * 250.0, sin(i*i*i) * 250.0);
        vec2 star_pos = vec2(sin(i) * 1000.0, sin(i*i*i) * 1000.0);

        //vec2 star_pos = vec2(sin(i) * 250.0, sin(i*i*i) * 1000.0);
        //vec2 star_pos = vec2( sin(i)*250.0, i  );
        //float z = mod(i*i - 50.0*time, 256.0); // x *time is speed.
        //float z = mod(i - 50.0*time, 256.0); // i is ?? but if not there..it will  .
        //float z = mod(i*i*i*i*i - 10.0*time, 256.0); // i is ?? but if not there..it will  .
        //float z = mod(i- 10.0*time, 1024.0); // i is ?? but if not there..it will  .
        float z = mod(i*i- 10.0*time, 256.0); // lava lamp also happens here..

        float fade = (256.0 - z) /256.0;
        //float fade = (1024.0 - z) /256.0; // cool effect almost lava lamp
        vec2 blob_coord = star_pos / z;
        gradient += ((fade / 384.0) / pow(length(centered_coord - blob_coord), 1.5)) * ( fade);
    }

    //curr_color = vec3(0,0,0);// only does stars
    //curr_color += gradient; // if comment out, only does nebula

    gl_FragColor = vec4(curr_color, 1.0); // might be alpha
    //gl_FragColor = vec4(gradient, 1.0); // does not work
}

Also providing the pde code:

/**
 * Nebula.
 *
 * From CoffeeBreakStudios.com (CBS)
 * Ported from the webGL version in GLSL Sandbox:
 * http://glsl.heroku.com/e#3265.2
 */

PShader nebula;

void setup() {
  fullScreen( P2D);
  //size(500,500, P2D);
  noStroke();

  nebula = loadShader("nebula.glsl");
  nebula.set("resolution", float(width), float(height));
}

void draw() {
  nebula.set("time", millis() / 500.0);
  shader(nebula);
  // This kind of raymarching effects are entirely implemented in the
  // fragment shader, they only need a quad covering the entire view
  // area so every pixel is pushed through the shader.
  rect(0, 0, width, height);

  resetShader();
  text("fr: " + frameRate, width/2, 10);
}

Extracting per-pixel shader information for a face


Hi all, I'm hoping this isn't too horribly complicated, but I'd like to figure out how to get the per-pixel values of a 3D face after applying lighting. I would like to use this to map individual faces to a PGraphics buffer that I then projection-map onto an object. To put it another way, I'd like to bake the lighting for each face, as I step through the frames of a sketch, into an unwrapped 2D texture that I can access pixel by pixel, or otherwise manipulate as a PGraphics or PImage. I think PShaders are the ticket, but I can't figure out where to start.

Best, Drew Hamilton

How to make objects in the foreground block objects in the background in 3D


I am trying to draw an object in 3D using the P3D renderer, but many surfaces are being drawn in the wrong order, despite depth testing and depth sorting being enabled. I am getting a picture like this:

[Screenshot: Screen Shot 2017-01-11 at 6.01.53 PM]

There does not seem to be any logical explanation for the draw order being used. Can someone please explain how to fix this?
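Since depth testing and sorting are reported as enabled, a remaining suspect is depth-buffer precision, which degrades as the zFar/zNear ratio grows; P3D's default near plane is derived from the sketch size and can be tiny relative to a large scene. A hedged sketch — the clip values below are placeholders to tune to the model's actual bounds:

```processing
void setup() {
  size(800, 600, P3D);
  // Tighten the clip planes: most depth precision sits near zNear, so
  // raising it (and keeping zFar modest) reduces z-fighting.
  perspective(PI/3.0, width/(float)height, 50, 10000);  // placeholder values
}
```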


Processing Video With A GLSL Shader And Saving The Video


How do you do the following with Processing?

  1. Load a video file.
  2. Apply a GLSL Shader to each frame.
  3. Save the processed video as a new file, complete with audio track?

Is Processing suitable for this? Can GLSL in Processing use GPU acceleration?

Can anybody point me to an example project where video is loaded, GLSL processed and saved again?

Thanks!
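A hedged outline with the Video library: apply the shader with filter() (which runs on the GPU) and write numbered frames with saveFrame(). Processing has no built-in muxer, so frames and the original audio are typically recombined afterwards with an external tool such as ffmpeg — the ffmpeg line in the comment is illustrative, not tested:

```processing
import processing.video.*;

Movie mov;
PShader fx;

void setup() {
  size(1280, 720, P2D);
  fx = loadShader("effect.glsl");   // assumed shader filename
  mov = new Movie(this, "input.mp4");
  mov.play();
}

void movieEvent(Movie m) {
  m.read();
}

void draw() {
  image(mov, 0, 0, width, height);
  filter(fx);                        // GPU per-pixel pass over the frame
  saveFrame("frames/f-######.png");  // then mux externally, e.g.:
  // ffmpeg -i frames/f-%06d.png -i input.mp4 -map 0:v -map 1:a out.mp4
}
```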

Changing the speed in a shader example


I have this shader code taken from The Book of Shaders.

I'm trying to figure out how to increase the velocity of these lines based on the mouse's y position. However, when I do this, I get a strange "scrubbing" effect: as I move the mouse, the pattern speeds up (or slows down), and only afterwards settles at the new velocity. What I would prefer is for each line to simply move at a speed proportional to the mouse's y position. I've been tweaking things here and there but can't figure out how to make this happen, and I've had similar problems with other sketches, so I'm wondering what I'm not understanding.

// Author @patriciogv - 2015
// Title: Ikeda Data Stream

#ifdef GL_ES
precision mediump float;
#endif

uniform vec2 u_resolution;
uniform vec2 u_mouse;
uniform float u_time;
uniform float u_spd;

float random (in float x) {
    return fract(sin(x)*1e4);
}

float random (in vec2 st) {
    return fract(sin(dot(st.xy, vec2(12.9898,78.233)))* 43758.5453123);
}

float pattern(vec2 st, vec2 v, float t) {
    vec2 p = floor(st+v);
    return step(t, random(100.+p*.000001)+random(p.x)*0.5 );
}

void main() {
    vec2 st = gl_FragCoord.xy/u_resolution.xy;
    st.x *= u_resolution.x/u_resolution.y;

    vec2 grid = vec2(100.0,50.);
    st *= grid;

    vec2 ipos = floor(st);  // integer
    vec2 fpos = fract(st);  // fraction

    //float spdfactor=u_mouse.y/u_resolution.y;
    float spdfactor=u_spd;// does the same thing.


    //vec2 vel = vec2(u_time*2.*max(grid.x,grid.y)); // time
    vec2 vel = vec2(u_time*spdfactor*2.*max(grid.x,grid.y)); // time
    //vec2 vel = vec2(2.*max(grid.x,grid.y));
    //vel *= vec2(-1.,0.0) * random(1.0+ipos.y); // direction // Assign a random value base on the integer coord
    //vel *= vec2(-1.,0.0) *random(1.0+ipos.y); // direction

    vel *= vec2(-1.,0.0) * (ipos.y + 1.0) / 100.0; // direction: speed scales with the row index (float literals avoid int/float mixing in GLSL ES)


    //offset means shift the rgb  channels a bit.
    vec2 offset = vec2(0.6,0.);
    //vec2 offset = vec2(ipos.x,0.);//interesting effect but not what is wnated.

    vec3 color = vec3(0.);
    color.r = pattern(st+offset,vel,0.5+u_mouse.x/u_resolution.x);
    color.g = pattern(st,vel,0.5+u_mouse.x/u_resolution.x);
    color.b = pattern(st-offset,vel,0.5+u_mouse.x/u_resolution.x);

    //color.r = pattern(st+offset,vel,u_time+0.5+u_mouse.x/u_resolution.x);
    //color.g = pattern(st,vel,u_time+0.5+u_mouse.x/u_resolution.x);
    //color.b = pattern(st-offset,vel,u_time+0.5+u_mouse.x/u_resolution.x);

    // Margins
    color *= step(0.5,fpos.y);

    //gl_FragColor = vec4(1.0-color,1.0);
    gl_FragColor = vec4(color,1.0);
}

and here is the .pde code:

PShader shader;

void setup() {
  size(640, 360, P2D);
  noStroke();

  shader = loadShader("shader7.frag"); // one thru 13
}

float m;
void draw() {
  shader.set("u_resolution", float(width), float(height));
  shader.set("u_mouse", float(mouseX), float(mouseY));
  shader.set("u_time", millis() / 1000.0);
  shader.set("u_spd", m);
  shader(shader);
  rect(0, 0, width, height);

  m=map(mouseY, 0, height, 0, 1);
}
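For what it's worth, the "scrubbing" usually comes from the shader computing the offset as u_time * speed: when the speed changes, all previously elapsed time is rescaled at once. One common fix (a sketch, not from the original post) is to accumulate a phase on the CPU each frame and pass that as the uniform instead of u_time; `PhaseAccumulator` below is a hypothetical helper showing just the accumulation math in plain Java:

```java
// Accumulates phase = integral of speed over time, so motion stays
// continuous even when the speed changes between frames.
public class PhaseAccumulator {
    private double phase = 0.0;
    private double lastTimeSec = 0.0;

    // Call once per frame with the current time in seconds; the returned
    // phase is what you would pass to the shader instead of u_time * speed.
    public double update(double nowSec, double speed) {
        double dt = nowSec - lastTimeSec;
        lastTimeSec = nowSec;
        phase += speed * dt; // advances by speed * dt; never rescales the past
        return phase;
    }

    public static void main(String[] args) {
        PhaseAccumulator acc = new PhaseAccumulator();
        acc.update(0.0, 1.0);                     // establish a start time
        System.out.println(acc.update(1.0, 1.0)); // 1 s at speed 1 -> phase 1.0
        System.out.println(acc.update(2.0, 5.0)); // speed jump: phase jumps by 5, no scrub
    }
}
```

In the sketch above you would then do something like `shader.set("u_phase", (float) phase)` each frame (uniform name hypothetical) and use that in the shader where `u_time * spdfactor` appears now.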

Game Objects (like projectiles) with Shaders?


I have a simple game engine architecture and I would like to add effects to my projectiles.
Are shaders typically applied over the entire screen, or is it possible to apply a shader locally to, let's say, a sprite, which is drawn using (a) an image or (b) some combination of vector drawings like ellipses, custom shapes, or an SVG file?

How might you prevent it from looking like just a rectangle over the image location? Is there a way to apply shaders only to that particular sprite? Would you just make sure that every non-'important' pixel in the image has an alpha value of 0? Just curious, as most of the shader examples I've seen always fill up the sketch area.

Things that I'd like to explore would be making glowing effects and such on every projectile, or trails. Thank you.
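On the alpha idea: the "rectangle over the image" look goes away when the sprite's non-important pixels carry alpha 0 and the renderer blends rather than overwrites. The snippet below (a sketch, not Processing API; this is the math standard source-over blending applies per channel) shows why alpha-0 pixels leave the background untouched:

```java
// Standard source-over alpha blending for one color channel.
public class AlphaBlend {
    // src, dst in [0, 255]; alpha in [0, 1].
    // alpha 0 keeps the destination; alpha 1 replaces it with the source.
    static int over(int src, int dst, double alpha) {
        return (int) Math.round(src * alpha + dst * (1.0 - alpha));
    }

    public static void main(String[] args) {
        System.out.println(over(255, 0, 0.0)); // transparent sprite pixel: background kept (0)
        System.out.println(over(255, 0, 1.0)); // opaque sprite pixel: sprite wins (255)
        System.out.println(over(255, 0, 0.5)); // half-transparent: 128
    }
}
```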

GL_MIRROR_REPEAT in texture wrap


Hello, I can't figure out how to set GL_TEXTURE_WRAP_x to GL_MIRRORED_REPEAT. Apparently the only two modes supported by textureWrap() (as documented) are GL_CLAMP_TO_EDGE and GL_REPEAT. I tried to dig into the source code, but I can't understand what textureWrap actually does.

Thank you.
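One workaround (an assumption, since textureWrap() only documents CLAMP and REPEAT) is to emulate mirrored repeat in the fragment shader by folding the texture coordinate before sampling, e.g. `uv = 1.0 - abs(1.0 - mod(uv, 2.0));` in GLSL. The plain-Java version below just checks that folding math:

```java
// Folds an arbitrary texture coordinate into [0, 1] with mirrored
// repetition, matching what GL_MIRRORED_REPEAT does per axis.
public class MirrorWrap {
    static double mirroredRepeat(double u) {
        double t = u % 2.0;
        if (t < 0) t += 2.0; // GLSL mod() is always non-negative; Java % is not
        return 1.0 - Math.abs(1.0 - t);
    }

    public static void main(String[] args) {
        System.out.println(mirroredRepeat(0.25)); // 0.25 (inside the first tile)
        System.out.println(mirroredRepeat(1.25)); // 0.75 (second tile is mirrored)
        System.out.println(mirroredRepeat(2.25)); // 0.25 (pattern repeats every 2 units)
    }
}
```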

What is wrong with this shader?


I do have a working processing script but the shader that I want to use (Multiple Lights with just color) returns errors:

Cannot link Shader program: Vertex info: 0(45) error C7623: implicit narrowing of type vec4 to float

I have no idea how to fix this, as I do not understand the error in this program. Thanks for the help!

Vertex Shader

#define PROCESSING_LIGHT_SHADER
#define NUM_LIGHTS 8

uniform mat4 modelview;
uniform mat4 transform;
uniform mat3 normalMatrix;

uniform int lightCount;
uniform vec4 lightPosition[8];

// focal factor for specular highlights (positive floats)
uniform vec4 AmbientContribution;

in vec4 vertex;//attribute same as "in"
in vec4 color;
in vec3 normal;

varying vec4 vertColor;

void main(){
     //Vertex normal direction

    float light;

    for (int i = 0; i < lightCount; i++){

        gl_Position = transform*vertex;
        vec3 vertexCamera = vec3(modelview * vertex);
        vec3 transformedNormal = normalize(normalMatrix * normal);
        light = 0.0f;

        vec3 dir = normalize(lightPosition[i].xyz - vertexCamera); //Vertex to light direction
        float amountDiffuse = max(0.0, dot(dir, transformedNormal));
        // calculate the vertex position in eye coordinates
        vec3 vertexViewDir = normalize(-vertexCamera);

        // calculate the vector corresponding to the light source reflected in the vertex surface.
        // lightDir is negated as GLSL expects an incoming (rather than outgoing) vector
        vec3 lightReflection = reflect(-dir, transformedNormal);

        //color=clamp(color,0.0f,1.0f);

        // calculate actual light intensity
        // NOTE: this is the line the reported error points at: AmbientContribution
        // is declared vec4, so adding it into the float 'light' narrows vec4 to float
        light += AmbientContribution;
        light += 0.25 * amountDiffuse;


    }
    vertColor = vec4(light, light, light, 1) * color;
}

Fragment Shader

#ifdef GL_ES
precision mediump float;
precision mediump int;
#endif

varying vec4 vertColor;

void main() {
  gl_FragColor = vertColor;
}