
Difficulty drawing pixel-accurate images in iOS 10.1 GLES2 #5381

@kylemcdonald

Description


When I draw an image that's made up of alternating black and white pixels like this:

    // one white pixel at the top-left of every 2x2 block, the rest black
    for(int y = 0; y < h; y++) {
        for(int x = 0; x < w; x++) {
            bool on = x % 2 == 0 && y % 2 == 0;
            img.setColor(x, y, on ? ofColor::white : ofColor::black);
        }
    }

(The inverse of the pattern here.) I get moiré patterns.

[screenshot: moiré pattern, 2016-12-02]

The above test is with ES2 and 4-sample antialiasing enabled. Disabling antialiasing changes the pattern but does not remove the moiré. ES1 does not show a moiré pattern regardless of whether antialiasing is enabled.

I checked out the latest OF, pulled libs, and created a project with the project generator (copied from the 0.9.8-ios release). I'm running iOS 10.1 and Xcode 8.1.

If I add the line ofSetupScreenOrtho(ofGetWidth(), ofGetHeight(), -1, +1); at the top of draw(), then everything is OK on real hardware. In the simulator the pattern is still "blurry" (instead of a 2x2 block reading back [255, 0, 0, 0] it reads [149, 43, 43, 17]), but there is no moiré. I checked whether the difference between the hardware and the simulator is due to floating-point accuracy, but the matrices computed by ofSetupScreenOrtho() are exactly the same in both cases, so I think it must be the graphics hardware.
