Commit
spring18
chriszielinski committed Apr 12, 2018
1 parent 639ee4d commit ee8c579
Showing 4 changed files with 35 additions and 32 deletions.
README.md: 32 changes (17 additions, 15 deletions)
@@ -34,22 +34,22 @@ This assignment builds off hw3 pt2. Instead of using your own Firebase console,

Begin by downloading the repo and opening "Snapchat Camera Lab.xcworkspace". Having finished homework 3, you should already be familiar with the code provided.

-For this lab, **you will only be editing ImagePickerViewController.swift and it's corresponding View Controller in Storyboard.**
+For this lab, **you will only be editing CameraViewController.swift and its corresponding View Controller in Storyboard.**

## Part 1: Connecting your iOS Device ##

Before writing any code, you'll need to connect your iPhone or iPad to your computer via a USB cable. Once you've done that, click the simulator dropdown and select your device name.

![](/README-images/README-1.png)

-Try building. You'll see an error pop up saying "Signing for "snapChatProject" requires a development team. Select a development team in the project editor." To fix this, you'll need to add a signing account, which you can do using an Apple ID. To set your Team Profile, **Open your project file in Xcode, and click "Add Account" in the Team dropdown (see image)**.
+Try building. You'll see an error pop up saying "Signing for "Snapchat Camera Lab" requires a development team. Select a development team in the project editor." To fix this, you'll need to add a signing account, which you can do using an Apple ID. To set your Team Profile, **Open your project file in Xcode, and click "Add Account" in the Team dropdown (see image)**.

![](/README-images/README-2.png)

Once you've set your Team as your Apple ID, try running your app again on your device. If everything's working, you'll see a blank gray view with some buttons (that don't work yet). If you're still having an issue, ask one of the TAs for help (you may need to change your deployment target from 11.1 to 11.0 if your phone is not running the latest iOS version). The first build on your device should take a few minutes to complete, so give Xcode some time to build.

## Part 2: Connecting Outlets ##
-Just so you become more familiar with which views are which, we left the outlets and actions in **ImagePickerViewController.swift** unconnected to storyboard. **Go ahead and connect these outlets and actions in ImagePickerViewController.swift to Main.storyboard**. Make sure to read the comments above each IBOutlet and IBAction, so that you are sure you are connecting them correctly.
+Just so you become more familiar with which views are which, we left the outlets and actions in **CameraViewController.swift** unconnected to storyboard. **Go ahead and connect these outlets and actions in CameraViewController.swift to Main.storyboard**. Make sure to read the comments above each IBOutlet and IBAction, so that you are sure you are connecting them correctly.

> Tip: You may find it helpful to open the **Document Outline**. You can drag from Outlets and Actions in code to UI elements in the Document Outline if you find that easier. If you need to delete any connections you made, click on your ViewController in the Document Outline or in Storyboard, and open the **Connections Inspector** to see all of your connections and delete any if necessary.
@@ -71,53 +71,55 @@ var captureSession: AVCaptureSession?
var previewLayer : AVCaptureVideoPreviewLayer?

// used to capture a single photo from our capture device
-let photoOutput = AVCapturePhotoOutput()
+let photoOutput: AVCapturePhotoOutput? = nil
```

### Part 3.1: Configuring Capture Session - creating a preview layer for displaying video ###

Notice the two commented out methods in `viewDidLoad`. These are two helper methods we've defined for the lab to help organize our code, but you will need to fill them out. To begin,

-1.instantiate an `AVCaptureSession` object in `viewDidLoad`, and set it equal to `captureSession`. We will be using only one session throughout this lab, so we need to keep a reference to it.
-2. uncomment the call to the helper function `createAndLayoutPreviewLayer` in `viewDidLoad`
-3. finish implementing the `createPreviewLayer` function. You can learn about preview layers here [AVCapturePreviewLayer](https://developer.apple.com/reference/avfoundation/avcapturepreviewlayer). All you will need to do is create an instance of AVCapturePreviewLayer, using our captureSession, and set it equal to `previewLayer`.
+1. instantiate an `AVCaptureSession` object in `viewDidLoad`, and set it equal to `captureSession`. We will be using only one session throughout this lab, so we need to keep a reference to it.
+2. uncomment the call to the helper function `createAndLayoutPreviewLayer(fromSession:)` in `viewDidLoad`
+3. finish implementing the `createAndLayoutPreviewLayer(fromSession:)` function. You can learn about preview layers here: [AVCaptureVideoPreviewLayer](https://developer.apple.com/documentation/avfoundation/avcapturevideopreviewlayer). All you will need to do is create an instance of `AVCaptureVideoPreviewLayer`, using our `captureSession`, and set it equal to `previewLayer` (see the sketch after this list).
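
For reference, a minimal sketch of steps 1 through 3 might look like the following. The property names and the `createAndLayoutPreviewLayer(fromSession:)` signature come from the lab; the `videoGravity` and frame lines are assumptions, since the starter code may already handle layout.

```swift
import AVFoundation
import UIKit

class CameraViewController: UIViewController {

    // Properties from the starter code (see the snippet above).
    var captureSession: AVCaptureSession?
    var previewLayer: AVCaptureVideoPreviewLayer?

    override func viewDidLoad() {
        super.viewDidLoad()

        // 1. Create one session and keep a reference to it for the whole lab.
        let session = AVCaptureSession()
        captureSession = session

        // 2. Call the (uncommented) helper with that session.
        createAndLayoutPreviewLayer(fromSession: session)
    }

    // 3. Create a preview layer from the session and assign it to `previewLayer`.
    func createAndLayoutPreviewLayer(fromSession session: AVCaptureSession) {
        let layer = AVCaptureVideoPreviewLayer(session: session)
        layer.videoGravity = .resizeAspectFill   // assumption: the starter may already set this
        layer.frame = view.bounds                // assumption: the starter may already lay this out
        view.layer.insertSublayer(layer, at: 0)
        previewLayer = layer
    }
}
```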

Once you've finished this, you've now created a view to display video from your camera. However, you won't see any video showing up yet when you run, since we haven't set our `captureSession`'s input. Let's do that now!

### Part 3.2: Configuring Capture Session - setting inputs and outputs ###
-Go back to `viewDidLoad` and uncomment the helper method `configureCaptureSession`. Inside the `configureCaptureSession` function, you will need to do the following:
+Go back to `viewDidLoad` and uncomment the helper method `configureCaptureSession(forDevicePosition:)`. Inside the `configureCaptureSession(forDevicePosition:)` function, you will need to do the following:

1. Create an input using a device, and add it to `captureSession`. This will allow us to receive data from the device (camera) while the session is running.
2. Create and add an `AVCapturePhotoOutput` output for our session. Use the `photoOutput` variable we defined for you, so that we can access this output outside of this method body (a sketch follows the tips below).

> Note: The lecture slides will help you if you're not sure what to do. Feel free to rename `someConstantWithABadName`.
> Tip: Once we have a *hardware* capture device, we need to create a capture device *input* to feed into our capture session.
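
A minimal sketch of `configureCaptureSession(forDevicePosition:)` might look like this. It keeps the lab's `someConstantWithABadName` placeholder, assumes `photoOutput` is declared as a `var` so it can be assigned here, and uses the built-in wide-angle camera as one reasonable device type to search for.

```swift
import AVFoundation
import UIKit

class CameraViewController: UIViewController {

    var captureSession: AVCaptureSession?
    var photoOutput: AVCapturePhotoOutput?   // assumption: declared as a var

    func configureCaptureSession(forDevicePosition position: AVCaptureDevice.Position) {
        guard let captureSession = captureSession else { return }

        // The built-in camera found at the requested position (front or back).
        let someConstantWithABadName = AVCaptureDevice.DiscoverySession(
            deviceTypes: [.builtInWideAngleCamera],
            mediaType: .video,
            position: position).devices.first

        // 1. Wrap the hardware device in an input and attach it to the session.
        if let camera = someConstantWithABadName,
            let input = try? AVCaptureDeviceInput(device: camera),
            captureSession.canAddInput(input) {
            captureSession.addInput(input)
        }

        // 2. Attach a photo output so we can capture a still image later.
        let output = AVCapturePhotoOutput()
        if captureSession.canAddOutput(output) {
            captureSession.addOutput(output)
        }
        photoOutput = output

        captureSession.startRunning()
    }
}
```
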
Once you've finished, try running your app. You should be able to see video data from your camera in the `previewLayer`. Try taking a photo - looks like we still have a problem (you should still see the squirrel at this point). To fix this, we need to capture the photo from our session input, using our session's output, which we'll do in the next step.

## Part 4: Taking a photo using AVCapturePhoto ##
Now, we'll implement taking a photo, and then displaying this image in our imageView (`imageViewOverlay`). To do this, we will need to use [`AVCapturePhoto`](https://developer.apple.com/documentation/avfoundation/avcapturephoto) and [`AVCapturePhotoCaptureDelegate`](https://developer.apple.com/documentation/avfoundation/avcapturephotocapturedelegate). Specifically, you will need to:

-1. [Capture a photo](https://developer.apple.com/documentation/avfoundation/avcapturephotooutput/1648765-capturephoto) using your `photoOutput` object in the `takePhoto` IBAction. Remember, this doesn't save the image anywhere, we will handle this using `AVCapturePhotoCaptureDelegate` in the next step.
+1. [Capture a photo](https://developer.apple.com/documentation/avfoundation/avcapturephotooutput/1648765-capturephoto) using your `photoOutput` object in the `takePhotoButtonWasPressed(_:)` IBAction method. Remember, this doesn't save the image anywhere; we will handle this using `AVCapturePhotoCaptureDelegate` in the next step.
2. Define the `AVCapturePhotoCaptureDelegate` method [`photoOutput(didFinishProcessingPhoto: ...)`](https://developer.apple.com/documentation/avfoundation/avcapturephotocapturedelegate/2873949-photooutput). (We did not add this in for you, you will need to create it and make sure your view controller class conforms to the delegate protocol).

-Inside of the photoOutput(didFinishProcessingPhoto: ...) function, you will need to do the following (in this order):
+Inside of the `photoOutput(didFinishProcessingPhoto: ...)` function, you will need to do the following (in this order):

-3. Create a UIImage using the `photo` parameter. [Slide 56](http://iosdecal.com/fall-2017-slides/lecture9.pdf#page=56) from lecture will help you if your stuck.
+3. Create a UIImage using the `photo` parameter. [Slide 54](http://iosdecal.com/lectures/2018/Spring/lecture9-avfoundation+location+maps.pdf#page=54) from lecture will help you if you're stuck.
4. Set `selectedImage` to this `UIImage`
5. Call `toggleUI(isInPreviewMode: true)`. This is already implemented for you, and it simply updates the UI elements on the screen (i.e., once the picture is taken, we want to present the send button, hide the camera flip button, etc.). A sketch of these steps follows.
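
Putting steps 1 through 5 together, a rough sketch of the capture flow might look like this. It assumes the starter project's `photoOutput`, `selectedImage`, and `toggleUI(isInPreviewMode:)` members behave as described above (they are stubbed here so the snippet stands alone).

```swift
import AVFoundation
import UIKit

class CameraViewController: UIViewController, AVCapturePhotoCaptureDelegate {

    var photoOutput: AVCapturePhotoOutput?   // configured in Part 3.2
    var selectedImage: UIImage?              // provided by the starter code
    func toggleUI(isInPreviewMode: Bool) {}  // provided by the starter code

    // 1. Ask the output for a photo; the result arrives via the delegate below.
    @IBAction func takePhotoButtonWasPressed(_ sender: UIButton) {
        photoOutput?.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
    }

    // 2. AVCapturePhotoCaptureDelegate callback.
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        // 3. Build a UIImage from the captured photo's data.
        guard error == nil,
            let data = photo.fileDataRepresentation(),
            let image = UIImage(data: data) else { return }

        // 4. Hold on to the image so it can be sent.
        selectedImage = image

        // 5. Switch the UI into preview mode (show the send button, hide the flip button, etc.).
        toggleUI(isInPreviewMode: true)
    }
}
```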

Once you've finished this step, you should be able to take and send photos!

-## Part 5 (OPTIONAL BUT ALSO NOT THAT BAD): Supporting front and back camera ##
-Right now, we can only take pictures using the front camera. To add support for toggling between the front camera and the rear facing camera, go back to the function we created at the beginning of the lab, `configureCaptureSession`. Right now `someConstantWithABadName` (which you probably know by is a reference to our front camera device) uses a discovery session to search for all devices with a built in camera.
+## Part 5 (OPTIONAL BUT ALSO NOT THAT BAD): Supporting front and back camera ##
+Right now, we can only take pictures using the front camera. To add support for toggling between the front camera and the rear facing camera, go back to the function we implemented at the beginning of the lab, `configureCaptureSession(forDevicePosition:)`. Right now `someConstantWithABadName` (which you probably know by now is a reference to our front camera device) uses a discovery session to search for all devices with a built-in camera.

To toggle between the front facing camera and the rear facing camera, you'll need to edit this constant, as well as the method `flipCamera`. Some hints:

1. You'll need to pass in a different `AVCaptureDevice.Position` to `someConstantWithABadName`.
-2. You may (and should) reuse the same `captureSession` when switching devices/cameras (meaning, you should be calling `configureCaptureSession` again within `flipCamera`), but you must remove your old camera input before adding a new one. You can do this by iterating through you `captureSession.inputs` and calling `captureSession.removeInput`.
+2. You may (and should) reuse the same `captureSession` when switching devices/cameras (meaning, you should be calling `configureCaptureSession(forDevicePosition:)` again within `flipCamera`), but you must remove your old camera input before adding a new one. You can do this by iterating through `captureSession.inputs` and calling `captureSession.removeInput(_:)` (see the sketch below).
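
Following those hints, a sketch of `flipCamera` might look like the following. Here `currentPosition` is a hypothetical helper property (not in the starter code) used to remember which camera is active, and the exact `flipCamera` signature may differ in the starter project (it may be an IBAction).

```swift
import AVFoundation
import UIKit

class CameraViewController: UIViewController {

    var captureSession: AVCaptureSession?
    var currentPosition: AVCaptureDevice.Position = .front   // hypothetical helper property

    func configureCaptureSession(forDevicePosition position: AVCaptureDevice.Position) {
        // Implemented in Part 3.2; now driven by `position`.
    }

    func flipCamera() {
        guard let captureSession = captureSession else { return }

        // Remove the old camera input(s) before adding a new one.
        for input in captureSession.inputs {
            captureSession.removeInput(input)
        }

        // Reuse the same session, reconfigured for the opposite camera.
        currentPosition = (currentPosition == .front) ? .back : .front
        configureCaptureSession(forDevicePosition: currentPosition)
    }
}
```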

If you implemented this correctly, you should be able to toggle back and forth between both of your device's cameras. Nice work!


## Grading ##
-Once you've finished the lab, you can check-off using this form https://goo.gl/forms/4aXgPQFhdEK5ZDPs2. If you weren't able to finish before 8:30pm, make sure to let a TA know you attended (do not fill out the google form) **and** make sure to get checked off next week.
+Once you've finished the lab, you can get checked-off using this form: [https://goo.gl/forms/MurNLD7tpac3uZt83](https://goo.gl/forms/MurNLD7tpac3uZt83). If you weren't able to finish before 8pm, make sure to fill in the keyword question, and be sure to get checked-off next week at the beginning of lab.
Snapchat Camera Lab.xcodeproj/project.pbxproj: 16 changes (8 additions, 8 deletions)
@@ -14,7 +14,7 @@
4196F2CB1E6509E8009916E4 /* AppDelegate.swift in Sources */ = {isa = PBXBuildFile; fileRef = 4196F2CA1E6509E8009916E4 /* AppDelegate.swift */; };
4196F2D01E6509E8009916E4 /* Main.storyboard in Resources */ = {isa = PBXBuildFile; fileRef = 4196F2CE1E6509E8009916E4 /* Main.storyboard */; };
4196F2DD1E650ECA009916E4 /* ImageFeed.swift in Sources */ = {isa = PBXBuildFile; fileRef = 4196F2DC1E650ECA009916E4 /* ImageFeed.swift */; };
-4196F2E31E6514EE009916E4 /* ImagePickerController.swift in Sources */ = {isa = PBXBuildFile; fileRef = 4196F2E21E6514EE009916E4 /* ImagePickerController.swift */; };
+4196F2E31E6514EE009916E4 /* CameraController.swift in Sources */ = {isa = PBXBuildFile; fileRef = 4196F2E21E6514EE009916E4 /* CameraController.swift */; };
4578E0701FAE8980009838AB /* GoogleService-Info.plist in Resources */ = {isa = PBXBuildFile; fileRef = 4578E06F1FAE8980009838AB /* GoogleService-Info.plist */; };
6C53C7661E723D77004D15B6 /* PostsTableViewController.swift in Sources */ = {isa = PBXBuildFile; fileRef = 6C53C7651E723D77004D15B6 /* PostsTableViewController.swift */; };
6C53C7681E723D87004D15B6 /* PostsTableViewCell.swift in Sources */ = {isa = PBXBuildFile; fileRef = 6C53C7671E723D87004D15B6 /* PostsTableViewCell.swift */; };
@@ -37,7 +37,7 @@
4196F2CF1E6509E8009916E4 /* Base */ = {isa = PBXFileReference; lastKnownFileType = file.storyboard; name = Base; path = Base.lproj/Main.storyboard; sourceTree = "<group>"; };
4196F2D61E6509E8009916E4 /* Info.plist */ = {isa = PBXFileReference; lastKnownFileType = text.plist.xml; path = Info.plist; sourceTree = "<group>"; };
4196F2DC1E650ECA009916E4 /* ImageFeed.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = ImageFeed.swift; sourceTree = "<group>"; };
-4196F2E21E6514EE009916E4 /* ImagePickerController.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; name = ImagePickerController.swift; path = "Snapchat Camera/ImagePickerController.swift"; sourceTree = SOURCE_ROOT; };
+4196F2E21E6514EE009916E4 /* CameraController.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; name = CameraController.swift; path = "Snapchat Camera/CameraController.swift"; sourceTree = SOURCE_ROOT; };
4578E06F1FAE8980009838AB /* GoogleService-Info.plist */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = text.plist.xml; path = "GoogleService-Info.plist"; sourceTree = "<group>"; };
6C53C7651E723D77004D15B6 /* PostsTableViewController.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = PostsTableViewController.swift; sourceTree = "<group>"; };
6C53C7671E723D87004D15B6 /* PostsTableViewCell.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = PostsTableViewCell.swift; sourceTree = "<group>"; };
@@ -67,6 +67,7 @@
417986CC1E6656930050E1DF /* App Utilities */ = {
isa = PBXGroup;
children = (
+4578E06F1FAE8980009838AB /* GoogleService-Info.plist */,
417986CA1E664CB60050E1DF /* ImageFlowLayout.swift */,
4196F2CA1E6509E8009916E4 /* AppDelegate.swift */,
);
@@ -83,12 +84,12 @@
name = Model;
sourceTree = "<group>";
};
-417986CE1E6657650050E1DF /* Image Picker */ = {
+417986CE1E6657650050E1DF /* Camera */ = {
isa = PBXGroup;
children = (
-4196F2E21E6514EE009916E4 /* ImagePickerController.swift */,
+4196F2E21E6514EE009916E4 /* CameraController.swift */,
);
name = "Image Picker";
name = Camera;
sourceTree = "<group>";
};
417986CF1E6657910050E1DF /* Tab Bar */ = {
@@ -121,14 +122,13 @@
4196F2C91E6509E8009916E4 /* Snapchat Camera */ = {
isa = PBXGroup;
children = (
-4578E06F1FAE8980009838AB /* GoogleService-Info.plist */,
942472491F9106A1007257E7 /* Authentication */,
942472481F910694007257E7 /* Constants */,
417986CD1E66571B0050E1DF /* Model */,
417986CC1E6656930050E1DF /* App Utilities */,
4196F2CE1E6509E8009916E4 /* Main.storyboard */,
6C71409E1E70C70F00D38EAD /* Choose Thread */,
-417986CE1E6657650050E1DF /* Image Picker */,
+417986CE1E6657650050E1DF /* Camera */,
417986CF1E6657910050E1DF /* Tab Bar */,
417986C21E6631020050E1DF /* Assets.xcassets */,
6C53C7641E723D51004D15B6 /* Posts */,
@@ -315,7 +315,7 @@
files = (
417986CB1E664CB60050E1DF /* ImageFlowLayout.swift in Sources */,
9424724B1F9106E4007257E7 /* Strings.swift in Sources */,
-4196F2E31E6514EE009916E4 /* ImagePickerController.swift in Sources */,
+4196F2E31E6514EE009916E4 /* CameraController.swift in Sources */,
417986C51E6632C40050E1DF /* TabBarController.swift in Sources */,
6CB38BE31E748E31000E3E31 /* Post.swift in Sources */,
9424724D1F910B3A007257E7 /* LogInViewController.swift in Sources */,