
Integer overflow in gdImageScaleBilinearPalette() #330

cmb69 opened this Issue Oct 10, 2016 · 0 comments


commented Oct 10, 2016

On platforms where plain `char` is signed, an integer overflow can happen in gdImageScaleBilinearPalette() due to sign extension when the red, green and blue components are passed to gdTrueColorAlpha().

Test Program

#include <stdio.h>
#include <gd.h>

int main()
{
    gdImagePtr src, dst;

    src = gdImageCreate(100, 100);
    gdImageColorAllocate(src, 255, 255, 255);

    gdImageSetInterpolationMethod(src, GD_BILINEAR_FIXED);
    dst = gdImageScale(src, 200, 200);

    printf("color: %d\n", gdImageGetPixel(dst, 99, 99));

    gdImageDestroy(dst);
    gdImageDestroy(src);

    return 0;
}
Expected Output

color: 16777215

Actual Output

color: -65793

@cmb69 cmb69 added the bug label Oct 10, 2016

@cmb69 cmb69 added this to the GD 2.2.4 milestone Oct 10, 2016

@cmb69 cmb69 self-assigned this Oct 10, 2016

@cmb69 cmb69 closed this in 77c8d35 Oct 10, 2016

cmb69 added a commit that referenced this issue Oct 10, 2016

Fix #330: Integer overflow in gdImageScaleBilinearPalette()
The color components are supposed to be in the range 0..255, so we must not
cast them to `signed char`, which can be the default for `char`.

(cherry picked from commit 77c8d35)

# Conflicts:
#	tests/gdimagescale/CMakeLists.txt