Nasso Nasso - 1 year ago 184
Java Question

OpenGL texture repeat artifacts

I'm using OpenGL (4.5 core, with LWJGL 3.0.0 build 90) and I noticed some artifacts on textures using the GL_REPEAT wrap mode with a high number of repetitions:

[Screenshot: texture artifacts]

What can cause this, and how can I fix it (if it can be fixed)?

Here, the plane is 100x100 units and the UVs range over 10000x10000. The screenshot is taken extremely close to the surface (from farther away, the texture is so small that it averages out to a flat gray), with the near plane at 0.0001 and the far plane at 10.
I'm not sure the problem is in the depth buffer, since the default OpenGL depth buffer has very high precision at close distances.
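To see why depth precision is unlikely to be the issue here (and why such a tiny near plane is still risky), here is a minimal sketch assuming the standard [0, 1] window-space depth mapping d(z) = f/(f-n) * (1 - n/z) for a positive eye-space distance z; the class and method names are illustrative, not LWJGL API:

```java
public class DepthPrecisionSketch {
    // Window-space depth in [0, 1] for the standard perspective mapping
    // (assumption: default glDepthRange, no reversed-Z).
    static double depth(double z, double near, double far) {
        return (far / (far - near)) * (1.0 - near / z);
    }

    public static void main(String[] args) {
        double near = 0.0001, far = 10.0; // the question's camera setup
        // At only 1 unit from the camera, depth is already ~0.9999:
        // nearly the whole [0, 1] range is spent on the first unit.
        System.out.println(depth(1.0, near, far));
    }
}
```

With near = 0.0001, almost all depth resolution is concentrated extremely close to the camera, which matches the intuition that the close-up artifacts are not a depth-buffer problem, but it also means most of the scene shares a sliver of the depth range.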

(EDIT: I suspect a floating-point precision error on the texture coordinates, but I'm not sure.)
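That suspicion is easy to check numerically: near a UV value of 10000, the spacing between adjacent representable 32-bit floats (the ULP) is as large as one texel of a 1024-texel-wide texture, so neighboring texels can collapse to the same coordinate. A minimal sketch (the 1024 texture size is an assumption for illustration):

```java
public class FloatUvPrecisionSketch {
    public static void main(String[] args) {
        // Spacing between adjacent representable floats near UV = 10000
        float ulp = Math.ulp(10000.0f);       // 2^-10 = 0.0009765625
        // UV step between adjacent texels of a 1024-texel texture, per repeat
        float texelStep = 1.0f / 1024.0f;     // also 0.0009765625
        // When the ULP reaches the texel step, distinct texels become
        // indistinguishable in the interpolated coordinate.
        System.out.println(ulp >= texelStep); // prints "true"
    }
}
```

So at UV magnitudes around 10000 there simply is not enough float precision left to address individual texels, regardless of what the shader does.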

Here are my shaders (I'm using deferred rendering, and the texture sampling happens in the geometry pass, so I'm only including the geometry pass shaders).

Vertex shader:

#version 450 core

uniform mat4 projViewModel;
uniform mat4 viewModel;
uniform mat3 normalView;

in vec3 normal_model;
in vec3 position_model;
in vec2 uv;
in vec2 uv2;

out vec3 pass_position_view;
out vec3 pass_normal_view;
out vec2 pass_uv;
out vec2 pass_uv2;

void main(){
    pass_position_view = (viewModel * vec4(position_model, 1.0)).xyz;
    pass_normal_view = normalView * normal_model;
    pass_uv = uv;
    pass_uv2 = uv2;

    gl_Position = projViewModel * vec4(position_model, 1.0);
}

Fragment shader:

#version 450 core

struct Material {
    sampler2D diffuseTexture;
    sampler2D specularTexture;

    vec3 diffuseColor;

    float uvScaling;
    float shininess;
    float specularIntensity;

    bool hasDiffuseTexture;
    bool hasSpecularTexture;
    bool faceSideNormalCorrection;
};

uniform Material material;

in vec3 pass_position_view;
in vec3 pass_normal_view;
in vec2 pass_uv;
in vec2 pass_uv2;

layout(location = 0) out vec4 out_diffuse;
layout(location = 1) out vec4 out_position;
layout(location = 2) out vec4 out_normal;

void main(){
    vec4 diffuseTextureColor = vec4(1.0);
    if(material.hasDiffuseTexture){
        diffuseTextureColor = texture(material.diffuseTexture, pass_uv * material.uvScaling);
    }

    float specularTextureIntensity = 1.0;
    if(material.hasSpecularTexture){
        specularTextureIntensity = texture(material.specularTexture, pass_uv * material.uvScaling).x;
    }

    vec3 fragNormal = pass_normal_view;
    if(material.faceSideNormalCorrection && !gl_FrontFacing){
        fragNormal = -fragNormal;
    }

    out_diffuse = vec4(diffuseTextureColor.rgb * material.diffuseColor, material.shininess);
    out_position = vec4(pass_position_view, 1.0); // Must be 1.0 on the alpha -> 0.0 = sky
    out_normal = vec4(fragNormal, material.specularIntensity * specularTextureIntensity);
}

Yes, I know the eye-space position in the G-buffer is redundant, since it can be reconstructed later from the depth buffer. I just did it this way for now; it's temporary.
Also, if anything in my shaders is deprecated or bad practice, it would be great if you could tell me what to do instead! Thanks!

Additional info (most of it probably irrelevant):

  1. Camera: FOV = 70°, Ratio = 16/9, Near = 0.0001, Far = 10

  2. OpenGL: Major = 4, Minor = 5, Profile = Core

  3. Texture: InternalFormat = , Filters = Anisotropic, Trilinear

  4. Hardware: GPU = NVIDIA GeForce GTX 970, CPU = Intel(R) Core(TM) i7-4790K CPU @ 4.00GHz, Memory = 16.00 GB RAM (15.94 GB usable), Screen = 1920 x 1080 @ 144Hz

  5. Driver: GeForce Game Ready Driver V368.69 (release: 6 July 2016)

Answer

This is most likely due to floating-point imprecision introduced during rasterization (interpolation, perspective correction), worsened by the normalization done in the fragment shader to fetch the correct texels.

But this is also a mipmapping problem: to choose which level to use, the UVs of adjacent pixels are compared to determine whether the texture is stretched or compressed on screen. Because of the imprecision, adjacent pixels can end up with identical UVs, so the differences between them (the partial derivatives) are zero. This makes the texture() function sample the base mipmap level (level 0) for those pixels, while their neighbors sample a coarse level, creating visible discontinuities.
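The level selection described above can be sketched in host code. This is a simplified model of the GPU's LOD formula (lod ≈ floor(log2(texSize · |dUV/dx|)), clamped to the level range), not LWJGL API; the function name and the 1024/10 constants are illustrative assumptions:

```java
public class MipLodSketch {
    // Simplified mip level selection from a screen-space UV derivative.
    // A zero derivative drives log2 toward -infinity, which clamps to
    // the base level 0 (the most detailed one).
    static int lodFor(float duDx, int texSize, int maxLevel) {
        if (duDx <= 0.0f) return 0; // degenerate derivative -> base level
        double lod = Math.floor(Math.log(texSize * (double) duDx) / Math.log(2.0));
        return (int) Math.max(0, Math.min(maxLevel, lod));
    }

    public static void main(String[] args) {
        int texSize = 1024, maxLevel = 10;
        // A heavily minified texture normally picks a coarse level:
        System.out.println(lodFor(0.01f, texSize, maxLevel));  // -> 3
        // But if imprecision collapses adjacent UVs, the derivative is 0
        // and the sampler falls back to level 0 -> a discontinuity:
        System.out.println(lodFor(0.0f, texSize, maxLevel));   // -> 0
    }
}
```

This is why the artifacts appear as isolated sharp patches inside an otherwise smoothly filtered surface: the fix is to keep UV magnitudes small (e.g. wrap them on the CPU) rather than to change the filtering.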
