Merge remote-tracking branch 'upstream/master'
Conflicts:
	framework/GPUImage.xcodeproj/project.pbxproj
fattjake committed May 14, 2012
2 parents 621d5da + fa31df5 commit b10f83f
Showing 121 changed files with 2,240 additions and 831 deletions.
7 changes: 6 additions & 1 deletion .gitignore
@@ -1,5 +1,6 @@
# Exclude the build directory
build/*
+examples/FilterShowcase/build*

# Exclude temp nibs and swap files
*~.nib
@@ -16,4 +17,8 @@ build/*
*.perspectivev3
*.pbxuser
*.xcworkspace
-xcuserdata
+xcuserdata
+
+# Documentation
+documentation/*

55 changes: 32 additions & 23 deletions README.md
@@ -46,6 +46,10 @@ For example, an application that takes in live video from the camera, converts t

GPUImageVideoCamera -> GPUImageSepiaFilter -> GPUImageView

+## Documentation ##
+
+Documentation is generated from header comments using appledoc. To build the documentation, switch to the "Documentation" scheme in Xcode. You should ensure that "APPLEDOC_PATH" (a User-Defined build setting) points to an appledoc binary, available on [GitHub](https://github.com/tomaz/appledoc) or through [Homebrew](https://github.com/mxcl/homebrew). It will also build and install a .docset file, which you can view with your favorite documentation tool.

## Built-in filters ##

### Color adjustments ###
@@ -88,8 +92,6 @@ For example, an application that takes in live video from the camera, converts t

### Image processing ###

-- **GPUImageRotationFilter**: This lets you rotate an image left or right by 90 degrees, or flip it horizontally or vertically

- **GPUImageTransformFilter**: This applies an arbitrary 2-D or 3-D transformation to an image
- *affineTransform*: This takes in a CGAffineTransform to adjust an image in 2-D
- *transform3D*: This takes in a CATransform3D to manipulate an image in 3-D
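
A minimal sketch of the 2-D case, assuming affineTransform is a settable property as described above:

GPUImageTransformFilter *transformFilter = [[GPUImageTransformFilter alloc] init];
// Rotate the image by 45 degrees in 2-D
transformFilter.affineTransform = CGAffineTransformMakeRotation(M_PI_4);
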
@@ -174,30 +176,34 @@ For example, an application that takes in live video from the camera, converts t
- *center*: The center about which to apply the pixellation, defaulting to (0.5, 0.5)
- *pixelSize*: The fractional pixel size, split into width and height components. The default is (0.05, 0.05)
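
A sketch of coarsening the effect, assuming pixelSize is exposed as a CGSize to match the width/height description above:

GPUImagePixellateFilter *pixellateFilter = [[GPUImagePixellateFilter alloc] init];
// Double the default fractional pixel size of (0.05, 0.05)
pixellateFilter.pixelSize = CGSizeMake(0.1, 0.1);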

+- **GPUImageCrosshatchFilter**: This converts an image into a black-and-white crosshatch pattern
+- *crossHatchSpacing*: The fractional width of the image to use as the spacing for the crosshatch. The default is 0.03.
+- *lineWidth*: A relative width for the crosshatch lines. The default is 0.003.
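
A sketch of adjusting both parameters, assuming they are plain float properties:

GPUImageCrosshatchFilter *crosshatchFilter = [[GPUImageCrosshatchFilter alloc] init];
// Double the default spacing and line width for a coarser hatch
crosshatchFilter.crossHatchSpacing = 0.06;
crosshatchFilter.lineWidth = 0.006;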

- **GPUImageSobelEdgeDetectionFilter**: Sobel edge detection, with edges highlighted in white
-- *imageWidthFactor*:
-- *imageHeightFactor*: These parameters affect the visibility of the detected edges
+- *texelWidth*:
+- *texelHeight*: These parameters affect the visibility of the detected edges
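
A sketch of thickening the detected edges, on the assumption that texelWidth and texelHeight are fractional offsets between samples (here, two texels out for 640x480 input):

GPUImageSobelEdgeDetectionFilter *edgeFilter = [[GPUImageSobelEdgeDetectionFilter alloc] init];
// Sample two texels apart instead of one, which broadens the edge response
edgeFilter.texelWidth = 2.0 / 640.0;
edgeFilter.texelHeight = 2.0 / 480.0;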

- **GPUImageCannyEdgeDetectionFilter**: This uses a Gaussian blur before applying a Sobel operator to highlight edges
-- *imageWidthFactor*:
-- *imageHeightFactor*: These parameters affect the visibility of the detected edges
+- *texelWidth*:
+- *texelHeight*: These parameters affect the visibility of the detected edges
- *blurSize*: A multiplier for the prepass blur size, ranging from 0.0 on up, with a default of 1.0
- *threshold*: Any edge above this threshold will be black, and anything below it will be white. Ranges from 0.0 to 1.0, with 0.5 as the default
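
A sketch of tuning the blur and threshold together, assuming both are float properties:

GPUImageCannyEdgeDetectionFilter *cannyFilter = [[GPUImageCannyEdgeDetectionFilter alloc] init];
// Smooth more aggressively before the Sobel pass, then keep only stronger edges
cannyFilter.blurSize = 2.0;
cannyFilter.threshold = 0.7;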

- **GPUImageSketchFilter**: Converts video to look like a sketch. This is just the Sobel edge detection filter with the colors inverted
- *intensity*: The degree to which the original image colors are replaced by the detected edges (0.0 - 1.0, with 1.0 as the default)
-- *imageWidthFactor*:
-- *imageHeightFactor*: These parameters affect the visibility of the detected edges
+- *texelWidth*:
+- *texelHeight*: These parameters affect the visibility of the detected edges

- **GPUImageToonFilter**: This uses Sobel edge detection to place a black border around objects, and then it quantizes the colors present in the image to give a cartoon-like quality to the image.
-- *imageWidthFactor*:
-- *imageHeightFactor*: These parameters affect the visibility of the detected edges
+- *texelWidth*:
+- *texelHeight*: These parameters affect the visibility of the detected edges
- *threshold*: The sensitivity of the edge detection, with lower values being more sensitive. Ranges from 0.0 to 1.0, with 0.2 as the default
- *quantizationLevels*: The number of color levels to represent in the final image. Default is 10.0
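
A sketch of a heavier cartoon look, assuming the two properties above:

GPUImageToonFilter *toonFilter = [[GPUImageToonFilter alloc] init];
// More sensitive edge detection and fewer color levels than the 0.2 / 10.0 defaults
toonFilter.threshold = 0.1;
toonFilter.quantizationLevels = 6.0;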

- **GPUImageSmoothToonFilter**: This uses a similar process to the GPUImageToonFilter, only it precedes the toon effect with a Gaussian blur to smooth out noise.
-- *imageWidthFactor*:
-- *imageHeightFactor*: These parameters affect the visibility of the detected edges
+- *texelWidth*:
+- *texelHeight*: These parameters affect the visibility of the detected edges
- *blurSize*: A multiplier for the prepass blur size, ranging from 0.0 on up, with a default of 0.5
- *threshold*: The sensitivity of the edge detection, with lower values being more sensitive. Ranges from 0.0 to 1.0, with 0.2 as the default
- *quantizationLevels*: The number of color levels to represent in the final image. Default is 10.0
@@ -225,7 +231,7 @@ For example, an application that takes in live video from the camera, converts t

- **GPUImageVignetteFilter**: Performs a vignetting effect, fading out the image at the edges
- *x*:
-- *y*: The directional intensity of the vignetting, with a default of x = 0.5, y = 0.75
+- *y*: The directional intensity of the vignetting, with a default of x = 0.75, y = 0.5
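
A sketch of tightening the effect, assuming x and y are plain float properties as described:

GPUImageVignetteFilter *vignetteFilter = [[GPUImageVignetteFilter alloc] init];
// Pull the fade-out in from the x = 0.75, y = 0.5 defaults
vignetteFilter.x = 0.6;
vignetteFilter.y = 0.4;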

- **GPUImageKuwaharaFilter**: Kuwahara image abstraction, drawn from the work of Kyprianidis, et al. in their publication "Anisotropic Kuwahara Filtering on the GPU" within the GPU Pro collection. This produces an oil-painting-like image, but it is extremely computationally expensive, so it can take seconds to render a frame on an iPad 2. This might be best used for still images.
- *radius*: An integer specifying the number of pixels out from the center pixel to test when applying the filter, with a default of 4. A higher value creates a more abstracted image, but at the cost of much greater processing time.
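
Given that cost, a sketch of running it over a still image instead of video, assuming the GPUImagePicture source and the imageFromCurrentlyProcessedOutput accessor used elsewhere in the framework:

UIImage *inputImage = [UIImage imageNamed:@"Sample.jpg"]; // hypothetical bundled image
GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:inputImage];
GPUImageKuwaharaFilter *kuwaharaFilter = [[GPUImageKuwaharaFilter alloc] init];
kuwaharaFilter.radius = 5; // larger radius: more abstraction, slower render

[stillImageSource addTarget:kuwaharaFilter];
[stillImageSource processImage];
UIImage *paintedImage = [kuwaharaFilter imageFromCurrentlyProcessedOutput];
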
@@ -264,6 +270,8 @@ Additionally, this is an ARC-enabled framework, so if you want to use this withi
To filter live video from an iOS device's camera, you can use code like the following:

GPUImageVideoCamera *videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
+videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

GPUImageFilter *customFilter = [[GPUImageFilter alloc] initWithFragmentShaderFromFile:@"CustomShader"];
GPUImageView *filteredVideoView = [[GPUImageView alloc] initWithFrame:CGRectMake(0.0, 0.0, viewWidth, viewHeight)];

@@ -274,7 +282,7 @@ To filter live video from an iOS device's camera, you can use code like the foll

[videoCamera startCameraCapture];

-This sets up a video source coming from the iOS device's back-facing camera, using a preset that tries to capture at 640x480. A custom filter, using code from the file CustomShader.fsh, is then set as the target for the video frames from the camera. These filtered video frames are finally displayed onscreen with the help of a UIView subclass that can present the filtered OpenGL ES texture that results from this pipeline.
+This sets up a video source coming from the iOS device's back-facing camera, using a preset that tries to capture at 640x480. Because the interface is in portrait mode and the camera is mounted landscape-left, its video frames need to be rotated before display. A custom filter, using code from the file CustomShader.fsh, is then set as the target for the video frames from the camera. These filtered video frames are finally displayed onscreen with the help of a UIView subclass that can present the filtered OpenGL ES texture that results from this pipeline.

The fill mode of the GPUImageView can be altered by setting its fillMode property, so that if the aspect ratio of the source video is different from that of the view, the video will either be stretched, centered with black bars, or zoomed to fill.
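
For example, to zoom and crop rather than letterbox (the constant name here is an assumption):

filteredVideoView.fillMode = kGPUImageFillModePreserveAspectRatioAndFill;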

@@ -290,11 +298,10 @@ Also, if you wish to enable microphone audio capture for recording to a movie, y
To capture and filter still photos, you can use a process similar to the one for filtering video. Instead of a GPUImageVideoCamera, you use a GPUImageStillCamera:

stillCamera = [[GPUImageStillCamera alloc] init];
+stillCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

filter = [[GPUImageGammaFilter alloc] init];
-GPUImageRotationFilter *rotationFilter = [[GPUImageRotationFilter alloc] initWithRotation:kGPUImageRotateRight];

-[stillCamera addTarget:rotationFilter];
-[rotationFilter addTarget:filter];
+[stillCamera addTarget:filter];
GPUImageView *filterView = (GPUImageView *)self.view;
[filter addTarget:filterView];
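
From there, a capture call might look like the following sketch; the completion-handler method name is an assumption about the GPUImageStillCamera API:

[stillCamera capturePhotoProcessedUpToFilter:filter withCompletionHandler:^(UIImage *processedImage, NSError *error){
    // processedImage holds the filtered still; encode it as a JPEG for saving
    NSData *dataForJPEGFile = UIImageJPEGRepresentation(processedImage, 0.8);
}];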

@@ -380,14 +387,12 @@ One thing to note when adding fragment shaders to your Xcode project is that Xco

Movies can be loaded into the framework via the GPUImageMovie class, filtered, and then written out using a GPUImageMovieWriter. GPUImageMovieWriter is also fast enough to record video in realtime from an iPhone 4's camera at 640x480, so a direct filtered video source can be fed into it.

-The following is an example of how you would load a sample movie, pass it through a pixellation and rotation filter, then record the result to disk as a 480 x 640 h.264 movie:
+The following is an example of how you would load a sample movie, pass it through a pixellation filter, then record the result to disk as a 480 x 640 h.264 movie:

movieFile = [[GPUImageMovie alloc] initWithURL:sampleURL];
pixellateFilter = [[GPUImagePixellateFilter alloc] init];
-GPUImageRotationFilter *rotationFilter = [[GPUImageRotationFilter alloc] initWithRotation:kGPUImageRotateRight];

-[movieFile addTarget:rotationFilter];
-[rotationFilter addTarget:pixellateFilter];
+[movieFile addTarget:pixellateFilter];

NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"];
unlink([pathToMovie UTF8String]);
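
A sketch of wiring up the writer from here, assuming GPUImageMovieWriter's initWithMovieURL:size: initializer and that movieWriter is an instance variable:

NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];
movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(480.0, 640.0)];

// Feed the filtered frames into the writer and run the movie through the pipeline
[pixellateFilter addTarget:movieWriter];
[movieWriter startRecording];
[movieFile startProcessing];
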
@@ -428,6 +433,10 @@ A bundled JPEG image is loaded into the application at launch, a filter is appli

A pixellate filter is applied to a live video stream, with a UISlider control that lets you adjust the pixel size on the live video.

+### SimpleVideoFileFilter ###
+
+A movie file is loaded from disk, an unsharp mask filter is applied to it, and the filtered result is re-encoded as another movie.

### MultiViewFilterExample ###

From a single camera feed, four views are populated with realtime filters applied to the camera feed. One is just the straight camera video, one is a preprogrammed sepia tone, and two are custom filters based on shader programs.
@@ -450,4 +459,4 @@ In other words, the path of this application is camera -> sepia tone filter -> c

A version of my ColorTracking example from http://www.sunsetlakesoftware.com/2010/10/22/gpu-accelerated-video-processing-mac-and-ios ported across to use GPUImage, this application uses color in a scene to track objects from a live camera feed. The four views you can switch between include the raw camera feed, the camera feed with pixels matching the color threshold in white, the processed video where positions are encoded as colors within the pixels passing the threshold test, and finally the live video feed with a dot that tracks the selected color. Tapping the screen changes the color to track to match the color of the pixels under your finger. Tapping and dragging on the screen makes the color threshold more or less forgiving. This is most obvious on the second, color thresholding view.

Currently, all processing for the color averaging in the last step is done on the CPU, so this part is extremely slow.
@@ -170,12 +170,11 @@ - (void)displayVideoForGPUImage;
NSLog(@"Start GPU Image");
videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
videoCamera.runBenchmark = YES;
+videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

sepiaFilter = [[GPUImageSepiaFilter alloc] init];
-GPUImageRotationFilter *rotationFilter = [[GPUImageRotationFilter alloc] initWithRotation:kGPUImageRotateRight];

-[videoCamera addTarget:rotationFilter];
-[rotationFilter addTarget:sepiaFilter];
+[videoCamera addTarget:sepiaFilter];
filterView = [[GPUImageView alloc] initWithFrame:self.view.bounds];
[self.view addSubview:filterView];
[sepiaFilter addTarget:filterView];
@@ -8,7 +8,7 @@ typedef enum { PASSTHROUGH_VIDEO, SIMPLE_THRESHOLDING, POSITION_THRESHOLDING, OB
CALayer *trackingDot;

GPUImageVideoCamera *videoCamera;
-GPUImageFilter *rotationFilter, *thresholdFilter, *positionFilter;
+GPUImageFilter *thresholdFilter, *positionFilter;
GPUImageRawData *positionRawData, *videoRawData;
GPUImageView *filteredVideoView;

@@ -49,6 +49,7 @@ - (void)configureVideoFiltering;
{
CGRect mainScreenFrame = [[UIScreen mainScreen] applicationFrame];
videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
+videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
filteredVideoView = [[GPUImageView alloc] initWithFrame:CGRectMake(0.0, 0.0, mainScreenFrame.size.width, mainScreenFrame.size.height)];
[self.view addSubview:filteredVideoView];

@@ -58,7 +59,6 @@ - (void)configureVideoFiltering;
positionFilter = [[GPUImageFilter alloc] initWithFragmentShaderFromFile:@"PositionColor"];
[positionFilter setFloat:thresholdSensitivity forUniform:@"threshold"];
[positionFilter setFloatVec3:thresholdColor forUniform:@"inputColor"];
-rotationFilter = [[GPUImageRotationFilter alloc] initWithRotation:kGPUImageRotateRight];

// CGSize videoPixelSize = filteredVideoView.bounds.size;
// videoPixelSize.width *= [filteredVideoView contentScaleFactor];
@@ -72,12 +72,8 @@ - (void)configureVideoFiltering;
videoRawData = [[GPUImageRawData alloc] initWithImageSize:videoPixelSize];
videoRawData.delegate = self;

-[videoCamera addTarget:rotationFilter];
-[rotationFilter addTarget:filteredVideoView];
-[rotationFilter addTarget:videoRawData];
-// [rotationFilter addTarget:positionFilter];
-// [positionFilter addTarget:filteredVideoView];
-// [positionFilter addTarget:videoRawData];
+[videoCamera addTarget:filteredVideoView];
+[videoCamera addTarget:videoRawData];

[videoCamera startCameraCapture];
}
@@ -147,31 +143,31 @@ - (void)handleSwitchOfDisplayMode:(id)sender;
trackingDot.opacity = 0.0f;
}

-[rotationFilter removeAllTargets];
+[videoCamera removeAllTargets];
[positionFilter removeAllTargets];
[thresholdFilter removeAllTargets];
-[rotationFilter addTarget:videoRawData];
+[videoCamera addTarget:videoRawData];

switch(displayMode)
{
case PASSTHROUGH_VIDEO:
{
-[rotationFilter addTarget:filteredVideoView];
+[videoCamera addTarget:filteredVideoView];
}; break;
case SIMPLE_THRESHOLDING:
{
-[rotationFilter addTarget:thresholdFilter];
+[videoCamera addTarget:thresholdFilter];
[thresholdFilter addTarget:filteredVideoView];
}; break;
case POSITION_THRESHOLDING:
{
-[rotationFilter addTarget:positionFilter];
+[videoCamera addTarget:positionFilter];
[positionFilter addTarget:filteredVideoView];
}; break;
case OBJECT_TRACKING:
{
-[rotationFilter addTarget:filteredVideoView];
-[rotationFilter addTarget:positionFilter];
+[videoCamera addTarget:filteredVideoView];
+[videoCamera addTarget:positionFilter];
[positionFilter addTarget:positionRawData];
}; break;
}
5 changes: 2 additions & 3 deletions examples/CubeExample/Classes/ES2Renderer.m
@@ -72,13 +72,12 @@ - (id)initWithSize:(CGSize)newSize;


videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
+videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
inputFilter = [[GPUImageSepiaFilter alloc] init];
-GPUImageRotationFilter *rotationFilter = [[GPUImageRotationFilter alloc] initWithRotation:kGPUImageRotateRight];
textureOutput = [[GPUImageTextureOutput alloc] init];
textureOutput.delegate = self;

-[videoCamera addTarget:rotationFilter];
-[rotationFilter addTarget:inputFilter];
+[videoCamera addTarget:inputFilter];
[inputFilter addTarget:textureOutput];

[videoCamera startCameraCapture];
@@ -65,6 +65,7 @@ - (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(N
case GPUIMAGE_CONTRAST: cell.textLabel.text = @"Contrast"; break;
case GPUIMAGE_BRIGHTNESS: cell.textLabel.text = @"Brightness"; break;
case GPUIMAGE_EXPOSURE: cell.textLabel.text = @"Exposure"; break;
+case GPUIMAGE_RGB: cell.textLabel.text = @"RGB"; break;
case GPUIMAGE_SHARPEN: cell.textLabel.text = @"Sharpen"; break;
case GPUIMAGE_UNSHARPMASK: cell.textLabel.text = @"Unsharp mask"; break;
case GPUIMAGE_GAMMA: cell.textLabel.text = @"Gamma"; break;
@@ -115,10 +116,12 @@ - (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(N
case GPUIMAGE_SOFTLIGHTBLEND: cell.textLabel.text = @"Soft light blend"; break;
case GPUIMAGE_KUWAHARA: cell.textLabel.text = @"Kuwahara"; break;
case GPUIMAGE_VIGNETTE: cell.textLabel.text = @"Vignette"; break;
-case GPUIMAGE_GAUSSIAN: cell.textLabel.text = @"Gaussian Blur"; break;
-case GPUIMAGE_FASTBLUR: cell.textLabel.text = @"Fast Blur"; break;
-case GPUIMAGE_BOXBLUR: cell.textLabel.text = @"Box Blur"; break;
-case GPUIMAGE_GAUSSIAN_SELECTIVE: cell.textLabel.text = @"Gaussian Selective Blur"; break;
+case GPUIMAGE_GAUSSIAN: cell.textLabel.text = @"Gaussian blur"; break;
+case GPUIMAGE_FASTBLUR: cell.textLabel.text = @"Fast blur"; break;
+case GPUIMAGE_MEDIAN: cell.textLabel.text = @"Median (3x3)"; break;
+case GPUIMAGE_BILATERAL: cell.textLabel.text = @"Bilateral blur"; break;
+case GPUIMAGE_BOXBLUR: cell.textLabel.text = @"Box blur"; break;
+case GPUIMAGE_GAUSSIAN_SELECTIVE: cell.textLabel.text = @"Gaussian selective blur"; break;
case GPUIMAGE_CUSTOM: cell.textLabel.text = @"Custom"; break;
case GPUIMAGE_FILECONFIG: cell.textLabel.text = @"Filter Chain"; break;
case GPUIMAGE_FILTERGROUP: cell.textLabel.text = @"Filter Group"; break;
@@ -6,6 +6,7 @@ typedef enum {
GPUIMAGE_CONTRAST,
GPUIMAGE_BRIGHTNESS,
GPUIMAGE_EXPOSURE,
+GPUIMAGE_RGB,
GPUIMAGE_SHARPEN,
GPUIMAGE_UNSHARPMASK,
GPUIMAGE_TRANSFORM,
@@ -42,6 +43,8 @@ typedef enum {
GPUIMAGE_GAUSSIAN_SELECTIVE,
GPUIMAGE_FASTBLUR,
GPUIMAGE_BOXBLUR,
+GPUIMAGE_MEDIAN,
+GPUIMAGE_BILATERAL,
GPUIMAGE_SWIRL,
GPUIMAGE_BULGE,
GPUIMAGE_PINCH,
