Add '1165/' from commit 'a20b2578773387e7e19f14b3fe6216714cada0c8'
git-subtree-dir: 1165
git-subtree-mainline: 7272fbc
git-subtree-split: a20b257
lvn committed Jan 3, 2017
2 parents 7272fbc + a20b257 commit b6a4782
# 1165notes
### by [Elvin Yung](https://github.com/elvinyung)

Notes for my 1165 term (i.e. Spring 2016) at the University of Waterloo.

In descending order of how likely I am to attend lectures for the course:
* [CS 341](cs341) - Algorithms
* [CS 349](cs349) - User Interfaces
* [CS 350](cs350) - Operating Systems
# Course Introduction

CS 349 - User interfaces, LEC 001

Elvin Yung

[Slides](https://www.student.cs.uwaterloo.ca/~cs349/s16/slides/1.1-introduction.pdf)

## What is a User Interface?
Some definitions that might work:
* how humans see the computer
* where humans and computers meet

A real definition:
* A *user interface* is where a person can express *intention* to the device, and the device can present *feedback*.

UIs don't just refer to how you interact with computers: microwaves, refrigerators, door bells, hammers, jets...

## A Brief History of Computer UIs
### Pre 1970s
* Computers had *batch* interfaces.
* They were rudimentary and mostly non-interactive.
* The hot new computer of the day: the ENIAC.
* Giving a computer instructions involved [punching holes in cards](https://en.wikipedia.org/wiki/Punched_card)...

![The Punch Bowl](https://upload.wikimedia.org/wikipedia/commons/5/58/FortranCardPROJ039.agr.jpg)

### 1970s - early 1980s
* We got *conversational* interfaces, i.e. command lines.
* We mainly saw them in two different places: *microcomputers* (the first personal computers), and *terminals* ("dumb" video display clients connected to mainframes).

![Apple II](https://upload.wikimedia.org/wikipedia/commons/8/82/Apple_II_tranparent_800.png)

![IBM 3278](http://www.corestore.org/3278-3.jpg)

* Some kid named Bill Gates bought a [Quick and Dirty Operating System](https://en.wikipedia.org/wiki/DOS) to use in IBM PCs (and IBM clones), for just $50,000...

![IBM PC](https://upload.wikimedia.org/wikipedia/commons/f/f1/Ibm_pc_5150.jpg)

### late 1980s - now?
* Xerox's Palo Alto Research Center (PARC) developed amazing technologies in the 1970s, things like Ethernet networking and object-oriented programming.
* The one that got the most attention was the bitmapped display. It let you show completely *graphical* user interfaces in your software, controlled with a device called a *mouse*.
* Xerox was too focused on their photocopier products, and never really capitalized on their innovations. They made some very expensive workstations based on the GUI and Ethernet, the Alto (1973) and the Star (1981), which never really sold well.
* In exchange for a small stake in Apple, Xerox let Steve Jobs [visit](https://www.youtube.com/watch?v=2u70CgBr-OI) PARC. ["Xerox grabbed defeat from the greatest victory in the computer industry."](https://www.youtube.com/watch?v=_1rXqD6M614)
* Apple ended up using the technology in the Lisa (which was a huge failure), and then the [Macintosh](http://www.folklore.org/ProjectView.py?project=Macintosh) (which started off a hit, but ended up also a huge failure).

![It sure is great to get out of that bag.](http://radio-weblogs.com/0102482/images/2005/06/06/hello-mac.jpg)

* Some [other company](https://www.youtube.com/watch?v=sforhbLiwLA) that [has no taste](https://www.youtube.com/watch?v=EJWWtV1w5fw) took the ideas for the GUI and [ran with it](https://en.wikipedia.org/wiki/Windows_1.0). And [bad things](https://en.wikipedia.org/wiki/Apple_Computer,_Inc._v._Microsoft_Corp.) happened.

![They just have no taste.](http://zdnet3.cbsistatic.com/hub/i/r/2015/07/23/db9b07b8-1bd3-4451-9365-7bd336f4d7dd/resize/1170x878/6a5511eafc6e9a454add33945466f8ed/cmwindows1-0jul15a.jpg)

### 1990s -
* 1989: Tim Berners-Lee came up with a hypertext format to be sent over a network connection. This has made a lot of people very angry and been widely regarded as a bad move. Tim decided to call his neat invention the WorldWideWeb.
* 1993: Marc Andreessen makes a web browser called Mosaic. It added graphics to web pages. This has made a lot of people very angry and been widely regarded as a bad move. Mosaic eventually became Netscape.
* People started putting `.com` at the end of their company name. It was a weird time.

### now? - future?
* Touchscreens

![No more Eat Up Martha.](https://i.ytimg.com/vi/e7EfxMOElBE/maxresdefault.jpg)

* Voice

![Echo echo](http://gazettereview.com/wp-content/uploads/2015/12/Amazon-Echo-1.jpg)


### How Has Computing Changed?
* The introduction of the GUI *fundamentally* changed how we use technology.
* Computers went from being a specialist tool to something everyone uses, without users having to be experts or build their own software.

## Interactive System Architecture
* The user has a *mental model* - how they think the device works.
* The device has a *system model* - how it actually works.
* Interaction: the user expresses an intention to the device, and the device presents feedback about that intention.
* An *event*, to the user, is an observable occurrence or phenomenon. To the system, it's a message saying that something happened.
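The system-side view of an event can be sketched as a plain message object. This is a hypothetical illustration, not an API from the course; the field names are made up:

```python
from dataclasses import dataclass

@dataclass
class Event:
    """System-side event: a message saying that something happened."""
    kind: str          # e.g. "mouse_down" or "key_press" (hypothetical names)
    x: int = 0         # pointer position, if the event has one
    y: int = 0
    timestamp: float = 0.0

# The system delivers messages like this one to the application:
click = Event(kind="mouse_down", x=120, y=45, timestamp=0.016)
```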

## Interface vs. interaction
* *Interface* is how the device presents itself to the user. These include controls, and visual, physical, and auditory cues.
* *Interaction* is the user's actions to perform a task, and how the device responds.

## Designing Interactions
* Designing good interaction is hard because users, and the things they want to do, are all different.
* Can you anticipate all scenarios?
* There's no single right way to build an interface - it can always be improved.

## Why Study Interaction?
* The right computer is a [bicycle for the mind](https://www.youtube.com/watch?v=ob_GX50Za6c&t=25s).
* A well designed tool with a good interface can radically improve our productivity and let us do things that we couldn't dream of before.
* New technology becomes widespread not when it becomes more powerful, but when it becomes easy to use.
# Wearables

CS 349 - User interfaces, LEC 001

7-4-2016

Elvin Yung

[Slides](https://www.student.cs.uwaterloo.ca/~cs349/s16/slides/10.1-wearable_computing.pdf)

## Smartwatches
[xkcd 1420](https://xkcd.com/1420/)

* We're deliberately not going to talk about things like Fitbits and Pebbles, because they're more specialized.
* We'll focus on the Apple Watch and Android Wear, which are general-purpose.

### Design Challenges
* The first issue is that these things are tiny! You're constrained by the size of the user's wrist.
* Things like the fat finger problem are much worse.
* Physical buttons are important - at this size, a touchscreen alone isn't practical.
* Limited attention
* A smartwatch is not intended to be the device of choice for complicated use cases - you're not going to be manipulating spreadsheets on it.
* Instead, smartwatches are for quick tasks on the go - things you want to be able to do without having to pull out your phone.

#### Guidelines from Google
* The watch is mostly an output device, not an input device.
* Google suggests that all computation be done on the phone and the results sent to the watch - in other words, the watch should just be a dumb terminal.
* The entire task on the watch should take <5 seconds - if it takes more, a watch is not the right device.
* The watch is *secondary* - it's only auxiliary to the phone, designed for quick interactions.

#### Guidelines from Apple
* Apple emphasizes personal communication on the Apple Watch. They push initiating communication from the watch, but it's not a very compelling use case.
* There are dedicated apps (but no one uses them).
* Interaction mostly via gestures, but there's also force touch, the "crown" dial, and the side buttons.
* Emphasize coordination with the smartphone - you should be able to tap to answer a call from the watch, and then control is transferred to the phone.

### The Big Question
*Why doesn't everyone have a smartwatch?*

* No "killer app" or other compelling use cases
* Probably not good enough as a proxy for phone
* Fitness tracking isn't sufficient for most people
* Healthcare, monitoring blood pressure, heart rate, etc. - maybe?
* Identification - Apple Pay, Android Pay, computer authentication etc. - maybe eventually replace passwords
* Price
* Battery sucks
* etc.

### Utilitarian vs. Fashionable Devices
* Is a smartwatch a piece of jewelry or a utility device?

## Ubiquitous Computing
* Introduced by Mark Weiser at Xerox PARC; his seminal paper dates from 1991
* Basically a very old term for Internet of Things
* Instead of having discrete devices that you carry, instrument the world around you to do things for you.
* For Ubicomp to really work, you need:
* Computation embedded into the environment
* Something that ties the person to the environment - a device that helps identify the person. Can a smartwatch fill this role? Maybe.

## Augmented Reality
* Examples: Google Glass, Hololens

### Design Principles
* Don't get in the way of what the user is doing
* Only give information that's relevant to what the user is currently doing. Don't always put the temperature in the corner!
* Avoid showing things the user didn't ask for

### Results
Google Glass didn't get wide adoption. What happened?

* Technology was not super feasible - 2 hour battery life?
* Principles of Ubicomp
* Google Glass was considered rude or awkward - [Glassholes](https://nypost.com/2014/07/14/is-google-glass-cool-or-just-plain-creepy/)
* There were cameras mounted on them, and when someone is walking around with Google Glass on, there's no indication of whether they're recording you
* Is Glass a fashion device? Google tried to position it that way, but it never really took off

AR definitely still has potential, though!

## More Generally for Wearables
And also other new technology.

* Why do you need a wearable?

* A better mousetrap is not good enough - it needs to be solving a problem - 10x not 10%.
* New technology takes time to mature! Remember old tablets and PDAs?
# Input

CS 349 - User interfaces, LEC 001

7-6-2016

Elvin Yung

[Slides](https://www.student.cs.uwaterloo.ca/~cs349/s16/slides/10.2-input.pdf)

* The iPod's click wheel was the perfect input method for a device where most of the UI elements were list-based.
* But it's not a good fit for much else - different interfaces call for different input devices.

## Classifying Computer Input
* Sensing method
* Mechanical - switch, potentiometer
* Motion - accelerometer, gyroscope
* Contact - capacitive touch, pressure sensor
* Signal processing
* Continuous vs discrete
* There are different input devices for different purposes, but we mostly use the mouse and the keyboard.

## Text Input
### QWERTY
* The QWERTY keyboard layout was first introduced in the Remington Model I typewriter in 1873.
* They were trying to design a keyboard that wouldn't jam, which happened when you pressed two adjacent keys at once.
* So the intention was to space out the key presses, so that the user would alternate between left and right hands while typing.
* So of course, when we added keyboards to computers, we stole this layout from typewriters, because that's what people were already used to.

* The optimal way to use a QWERTY keyboard is to keep your hands on the home row and move individual fingers to reach the other keys.
* Except it doesn't actually work that well:
* Awkward key combinations, like `tr`
* Sometimes you have to jump over the home row, e.g. `br`
* Because of letter frequency, most of the typing is actually done with the left hand. Because most people are right-handed, this can slow people down.
* Statistics on key presses:
* 52% on the top row
* 32% on the home row
* 16% on the bottom row
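The row-usage claim is easy to check on any piece of text. A quick sketch (the helper is hypothetical, not from the slides) that tallies letter keystrokes per QWERTY row:

```python
TOP_ROW = set("qwertyuiop")
HOME_ROW = set("asdfghjkl")
BOTTOM_ROW = set("zxcvbnm")

def row_distribution(text):
    """Fraction of letter keystrokes falling on each QWERTY letter row."""
    letters = [ch for ch in text.lower() if ch in TOP_ROW | HOME_ROW | BOTTOM_ROW]
    total = len(letters)
    frac = lambda row: sum(ch in row for ch in letters) / total
    return {"top": frac(TOP_ROW), "home": frac(HOME_ROW), "bottom": frac(BOTTOM_ROW)}

# For ordinary English text, the home row gets well under half the presses:
print(row_distribution("the quick brown fox jumps over the lazy dog"))
```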

#### Other layouts
* Since QWERTY has so many issues, there are a few remapped layouts.

* Example: Dvorak
* Letters should be typed by alternating between hands
* 70% of letters are on home row
* Bias towards right-handed typing, since most people are right-handed

* **Studies are inconclusive on whether there's any actual productivity difference when using a non-QWERTY keyboard layout.**
* An interesting point: it's really useful to be able to sit down at any computer and be able to type right away.

### Mechanical Keyboards
* If the keys are downsized (e.g. on a BlackBerry), it interferes with typing.

### Soft Keyboards
* on touchscreens, etc.
* You no longer get any sort of tactile feedback. You have to either get really good at touch typing, or hope that autocomplete works well enough.
* We're basically trading a physical keyboard to get a bigger screen.
* Soft keyboards are good on devices where you don't have to do a lot of typing. e.g. an iPad can be used mostly as a movie watching device

### Other variants
* Thumb keyboards - so that you can hold the device and type reasonably well with just your thumbs
* Frogpad - one-handed keyboard with only 15 keys plus some meta keys; different combinations of meta keys let you type different letters
* Chording keyboards - Douglas Engelbart proposed this - basically, a keyboard that has only 5 keys, and you type different combinations of keys.
* Successor: [the Twiddler](http://twiddler.tekgear.com/)
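Engelbart's five-key idea works because even a handful of keys yields plenty of combinations: five keys give 2⁵ − 1 = 31 non-empty chords, more than enough for the alphabet. A hypothetical sketch (the chord-to-letter assignment is made up):

```python
from itertools import combinations

KEYS = "12345"  # five finger keys on a hypothetical chording keyboard

# Every non-empty subset of the five keys is a distinct chord: 31 in total.
chords = [frozenset(c)
          for n in range(1, len(KEYS) + 1)
          for c in combinations(KEYS, n)]

# A made-up assignment of the first 26 chords to letters:
layout = dict(zip(chords, "abcdefghijklmnopqrstuvwxyz"))
```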

### Predictive Text Input
* T9
* Autocomplete/autocorrect
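T9 works by mapping each digit to several letters and using a dictionary to disambiguate the typed key sequence. A minimal sketch (the word list and helper names are made up):

```python
T9 = {"2": "abc", "3": "def", "4": "ghi", "5": "jkl",
      "6": "mno", "7": "pqrs", "8": "tuv", "9": "wxyz"}
DIGIT_OF = {ch: d for d, letters in T9.items() for ch in letters}

def t9_code(word):
    """Digit sequence a user would press to type the word."""
    return "".join(DIGIT_OF[ch] for ch in word.lower())

def t9_matches(keys, dictionary):
    """All dictionary words whose digit sequence equals the typed keys."""
    return [w for w in dictionary if t9_code(w) == keys]

# "4663" is ambiguous - several common words share the same key sequence:
print(t9_matches("4663", ["good", "home", "gone", "hoof"]))  # ['good', 'home', 'gone', 'hoof']
```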

### Others
* Palm Pilot's [Graffiti](https://en.wikipedia.org/wiki/Graffiti_(Palm_OS)) - had decent accuracy, but you needed to memorize this entire scheme.
* Natural handwriting recognition
* [ShapeWriter](https://en.wikipedia.org/wiki/ShapeWriter) - original inspiration for Swype, let people type on a touchscreen without lifting their finger
* IJQwerty - study that found people were more productive when I and J were swapped in ShapeWriter
* [8pen](http://www.8pen.com/) - enter words by drawing loops
* Seems like it'd be error prone.

## Positional Input
* Ur-example: Etch-A-Sketch

### Properties
#### Sensing
* Force or **isometric**: Input data is in the form of direction and magnitude of force, e.g. joystick
* Displacement or **isotonic**: Input data is in the form of position difference, e.g. mouse

#### Position vs Rate Control
* Rate: joystick
* Position: mouse
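The distinction can be sketched in a few lines (hypothetical functions, one dimension for simplicity): with position control the device's displacement moves the cursor directly, while with rate control the device's deflection sets the cursor's *velocity*, which is integrated over time:

```python
def position_control(cursor_x, mouse_dx):
    """Position control (mouse): device displacement displaces the cursor."""
    return cursor_x + mouse_dx

def rate_control(cursor_x, deflection, dt, max_speed=500.0):
    """Rate control (joystick): deflection in [-1, 1] sets cursor velocity."""
    return cursor_x + deflection * max_speed * dt

# Holding a half-deflected stick for 60 frames (~1 s) moves the cursor 250 units:
x = 0.0
for _ in range(60):
    x = rate_control(x, 0.5, 1 / 60)
```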

#### Absolute vs Relative
* This describes how the input device is mapped to the display.
* **Absolute**: where you touch is directly mapped onto the display.
* Example: A drawing tablet
* Normally, however, on a desktop we use **relative** input.
* Example: moving the mouse moves the cursor proportionally, but doesn't teleport it to some absolute location.
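A hypothetical sketch of the two mappings: an absolute device scales its own coordinates onto the screen, while a relative device only offsets the current cursor position:

```python
def absolute_position(touch_x, touch_y, tablet_w, tablet_h, screen_w, screen_h):
    """Absolute mapping (drawing tablet): tablet coordinates scale onto the screen."""
    return (touch_x / tablet_w * screen_w, touch_y / tablet_h * screen_h)

def relative_position(cursor_x, cursor_y, dx, dy):
    """Relative mapping (mouse): movement nudges the cursor from where it is."""
    return (cursor_x + dx, cursor_y + dy)

# The centre of the tablet always lands at the centre of the screen...
print(absolute_position(50, 50, 100, 100, 1920, 1080))   # (960.0, 540.0)
# ...while the mouse just offsets the cursor from its last position.
print(relative_position(300, 200, 5, -3))                # (305, 197)
```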

#### Control-Display Gain
* The **gain** is the ratio of how fast the pointer moves to how fast the input device moves.

* If the CD gain is 1, when the input device moves some distance, the pointer moves the same distance.
* If the CD gain is less than 1, the pointer moves more slowly than the device.
* If the CD gain is more than 1, the pointer moves faster than the device.

* In lots of OSes this is also known as the **sensitivity**, and it's generally tunable in the settings.
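A minimal sketch of applying a constant CD gain (a hypothetical function; real OSes usually make the gain a function of device speed, i.e. pointer acceleration):

```python
def pointer_delta(device_dx, device_dy, cd_gain):
    """Scale device movement by the CD gain to get pointer movement."""
    return (device_dx * cd_gain, device_dy * cd_gain)

# Gain 1.0: the pointer matches the device; gain 2.0: it moves twice as far.
print(pointer_delta(10, 4, 1.0))   # (10.0, 4.0)
print(pointer_delta(10, 4, 2.0))   # (20.0, 8.0)
```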
