
Add more days of notes.

commit 17a7744d88946ab2ad0ad69e6217cec45791fa7d 1 parent f839a8a
@steveWang authored
146 cs150.html
@@ -594,7 +594,151 @@
sums. (Section 2.7). Based on the combining theorem, which says that <mathjax>$XA +
X\bar{A} = X$</mathjax>. Ideally: every row should just have a single value
changing. So, I use Gray codes. (e.g. 00, 01, 11, 10). Graphical
-representation!</p></div><div class='pos'></div>
+representation!</p>
+<p><a name='9'></a></p>
+<h1>CS 150: Digital Design &amp; Computer Architecture</h1>
+<h2>September 18, 2012</h2>
+<p>In lab this week you are learning about Chipscope. Chipscope is kinda
+like what it sounds like: it allows you to monitor things happening in the
+FPGA. One of the interesting things about Chipscope is that, since it's an
+FSM monitoring stuff in your FPGA, it also gets compiled down, and it
+changes the placement of everything that goes into your chip. It can
+actually make your bug go away (e.g. timing bugs).</p>
+<p>So. Counters. How do counters work? If I've got a 4-bit counter and I'm
+counting from 0, what's going on here?</p>
+<p>D-ff with an inverter and enable line? This is a T-ff (toggle
+flipflop). That'll get me my first bit, but my second bit is slower. <mathjax>$Q_1$</mathjax>
+wants to toggle only when <mathjax>$Q_0$</mathjax> is 1. With subsequent bits, they want to
+toggle when all lower bits are 1.</p>
+<p>Counter with en: enable is tied to the toggle of the first bit. Counter
+with ld: four input bits, four output bits. Clock. Load. Then we're going
+to want to do a counter with ld, en, rst. Put in logic, etc.</p>
+<p>Quite common: ripple carry out (RCO), where we AND <mathjax>$Q[3:0]$</mathjax> and feed this
+into the enable of <mathjax>$T_4$</mathjax>.</p>
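+<p>A minimal Python sketch (mine, not from the lecture) of that structure: bit <mathjax>$i$</mathjax> toggles on a clock tick only when enable is high and all lower bits are 1, and the RCO is the AND of all the bits.</p>
+<pre><code># Behavioral sketch of a 4-bit synchronous counter built from T flip-flops.
+# bits[0] is the LSB. The T input of stage i is "enable AND all lower bits
+# are 1"; RCO (ripple carry out) is the AND of all four bits.
+
+def tick(bits, enable=True):
+    toggles = [enable and all(bits[:i]) for i in range(len(bits))]
+    return [b ^ t for b, t in zip(bits, toggles)]
+
+def rco(bits):
+    return all(bits)
+
+state = [False] * 4
+for cycle in range(20):
+    value = sum(int(b) * 2**i for i, b in enumerate(state))
+    print(cycle, value, "RCO" if rco(state) else "")
+    state = tick(state)
+</code></pre>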
+<p>Ring counter (shift register with a one-hot output): if reset is low, I
+just shift this thing around and get a circular shift register. If reset is
+high, I clear the out bit.</p>
+<p>Mobius counter: just a ring counter with a feedback inverter in it. It
+takes whatever state is in there, and after n clock ticks, it inverts
+itself. So you have <mathjax>$n$</mathjax> flipflops, and you get <mathjax>$2n$</mathjax> states.</p>
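+<p>A quick sketch (not from lecture) contrasting the two: with <mathjax>$n$</mathjax> flipflops, a one-hot ring counter cycles through <mathjax>$n$</mathjax> states, while the Mobius (Johnson) counter, which inverts the bit fed back, cycles through <mathjax>$2n$</mathjax>.</p>
+<pre><code># Cycle lengths of an n-bit ring counter vs. a Mobius (Johnson) counter.
+# Both are shift registers; the Johnson counter inverts the feedback bit.
+
+def cycle_length(state, invert_feedback):
+    start, steps = tuple(state), 0
+    while True:
+        feedback = state[-1] ^ invert_feedback
+        state = [feedback] + state[:-1]
+        steps += 1
+        if tuple(state) == start:
+            return steps
+
+n = 4
+ring = [True] + [False] * (n - 1)   # one-hot seed
+johnson = [False] * n               # all-zeros seed works here
+print("ring counter states:   ", cycle_length(ring, False))    # n
+print("mobius counter states: ", cycle_length(johnson, True))  # 2n
+</code></pre>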
+<p>And then you've got LFSRs (linear feedback shift registers). Given N
+flipflops, we know that a straight up or down counter will give us <mathjax>$2^N$</mathjax>
+states. Turns out that an LFSR gives you almost that: <mathjax>$2^N - 1$</mathjax> states,
+everything except all-zeros. So why do that instead of an up-counter? This
+can give you a PRNG. Fun times with Galois fields.</p>
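+<p>A hedged sketch of the LFSR idea (the taps below are just one maximal-length choice for 4 bits, not anything from the lecture): the feedback is an XOR of a couple of stages, and the register walks through all <mathjax>$2^N - 1$</mathjax> nonzero states before repeating.</p>
+<pre><code># Fibonacci-style LFSR sketch: 4 flip-flops, feedback = XOR of two taps.
+# With a maximal-length tap choice the counter visits 2^4 - 1 = 15 states;
+# the all-zeros state is a fixed point, which is why it's excluded.
+
+def lfsr_period(taps, seed):
+    state, start, steps = list(seed), tuple(seed), 0
+    while True:
+        feedback = False
+        for t in taps:
+            feedback ^= state[t]
+        state = [feedback] + state[:-1]
+        steps += 1
+        if tuple(state) == start:
+            return steps
+
+print("period:", lfsr_period(taps=[3, 2], seed=[True, False, False, False]))  # 15
+</code></pre>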
+<p>Various uses, seeds, high enough periods (Mersenne twisters are higher).</p>
+<h2>RAM</h2>
+<p>Remember, decoder, cell array, <mathjax>$2^n$</mathjax> rows, <mathjax>$2^n$</mathjax> word lines, some number of
+bit lines coming out of that cell array for I/O with output-enable and
+write-enable.</p>
+<p>When output-enable is low, D goes to high-Z. At some point, some external
+device starts driving some Din (not from memory). Then I can apply a write
+pulse (write strobe), which causes our data to be written into the memory
+at this address location. Whatever was driving it releases, so it goes back
+to high-impedance, and if we turn output-enable on again, we'll see "Din" from
+the cell array.</p>
+<p>During the write pulse, we need Din stable and address stable. We have a
+pulse because we don't want to break things. Bad things happen.</p>
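+<p>A toy behavioral model (my sketch, not the lab's) of that protocol: no clock, reads are combinational when output-enable is asserted, the bus floats otherwise, and a write strobe latches Din at the applied address.</p>
+<pre><code># Toy model of an asynchronous SRAM port. OE low means the data pins float
+# (high-Z, modeled here as None); a write strobe captures whatever is on
+# Din at the currently applied address.
+
+class AsyncSRAM:
+    def __init__(self, depth):
+        self.mem = [0] * depth
+
+    def read(self, addr, output_enable):
+        if not output_enable:
+            return None                 # bus is high-impedance
+        return self.mem[addr]
+
+    def write_strobe(self, addr, din):
+        # addr and din must stay stable for the whole pulse in real hardware
+        self.mem[addr] = din
+
+ram = AsyncSRAM(depth=16)
+print(ram.read(3, output_enable=False))   # None: high-Z
+ram.write_strobe(3, 0xAB)
+print(hex(ram.read(3, output_enable=True)))
+</code></pre>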
+<p>Notice: no clock anywhere. Your FPGA (in particular, the block RAM on the
+ML505) is a little different in that it has registered inputs (addr &amp;
+data). First off, it's very configurable; there are all sorts of ways you
+can set it up. The address in particular goes into a register, and from
+there into a decoder before it goes into the cell array. What comes out of
+the cell array is a little different too: there's a data-in line that goes
+into a register, and a separate data-out that can be configured in a whole
+bunch of different ways so that you can do a bunch of different things.</p>
+<p>The important thing is that you can apply your address to those inputs, and
+it doesn't show up until the rising edge of the clock. There's the option
+of having either registered or non-registered output (non-registered for
+this lab).</p>
+<p>So now we've got an ALU and RAM. And so we can build some simple
+datapaths. For sure you're going to see on the final (and most likely the
+midterm) problems like "given a 16-bit ALU and a 1024x16 sync SRAM, design
+a system to find the largest unsigned int in the SRAM."</p>
+<p>Demonstration of clock cycles, etc. So what's our FSM look like? Either
+LOAD or HOLD.</p>
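+<p>A software-level sketch (hypothetical, not the expected RTL answer) of what that datapath and FSM compute: sweep every address of the 1024x16 SRAM, compare the word read with a max register through the ALU, and LOAD the register only when the new word wins, else HOLD.</p>
+<pre><code># Cycle-by-cycle flavor of the "find the largest unsigned int" datapath:
+# an address counter sweeps the SRAM, the ALU compares the word read with
+# the max register, and the FSM picks LOAD vs HOLD for that register.
+
+import random
+
+DEPTH = 1024
+
+def find_max(sram):
+    max_reg = 0
+    for addr in range(DEPTH):
+        word = sram[addr]                     # synchronous read
+        load = (max(word, max_reg) == word)   # ALU unsigned compare
+        if load:                              # FSM state LOAD
+            max_reg = word
+        # else: HOLD, max_reg keeps its value
+    return max_reg
+
+sram = [random.randrange(2**16) for _ in range(DEPTH)]
+assert find_max(sram) == max(sram)
+print(find_max(sram))
+</code></pre>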
+<p>On homework, did not say sync SRAM. Will probably change.</p>
+<p><a name='10'></a></p>
+<h1>CS 150: Digital Design &amp; Computer Architecture</h1>
+<h2>September 20, 2012</h2>
+<p>Non-overlapping clocks. n-phase means that you've got n different outputs,
+and at most one high at any time. Guaranteed dead time between when one
+goes low and next goes high.</p>
+<h2>K-maps</h2>
+<p>Finding minimal sum-of-products and product-of-sums expressions for
+functions. <strong>On-set</strong>: all the ones of a function; <strong>implicant</strong>: one or
+more circled ones in the onset; a <strong>minterm</strong> is the smallest implicant you
+can have, and they go up by powers of two in the number of things you can
+have; a <strong>prime implicant</strong> can't be combined with another (by circling);
+an <strong>essential prime implicant</strong> is a prime implicant that contains at
+least one one not in any other prime implicant. A <strong>cover</strong> is any
+collection of implicants that contains all of the ones in the on-set, and a
+<strong>minimal cover</strong> is one made up of essential prime implicants and the
+minimum number of implicants.</p>
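+<p>A tiny brute-force check of these definitions on a made-up function (my example, not from lecture): for <mathjax>$f(a,b,c,d)$</mathjax> with on-set <mathjax>$\{0,2,5,7,8,10,13,15\}$</mathjax>, the essential prime implicants are <mathjax>$\bar{b}\bar{d}$</mathjax> and <mathjax>$bd$</mathjax>, and together they already form a minimal cover.</p>
+<pre><code># Check that b'd' + bd covers exactly the on-set {0,2,5,7,8,10,13,15} of a
+# made-up f(a,b,c,d). Minterm index is 8a + 4b + 2c + d.
+
+from itertools import product
+
+on_set = {0, 2, 5, 7, 8, 10, 13, 15}
+
+for a, b, c, d in product((0, 1), repeat=4):
+    minterm = 8*a + 4*b + 2*c + d
+    f = minterm in on_set
+    cover = (not b and not d) or (b and d)   # b'd' + bd
+    assert f == bool(cover), minterm
+
+print("b'd' + bd is a minimal cover of the on-set")
+</code></pre>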
+<p>Hazards vs. glitches. Glitches are when timing issues result in dips (or
+spikes) in the output; a hazard is the possibility that such a glitch might
+happen. Completely irrelevant in synchronous logic.</p>
+<h2>Project</h2>
+<p>3-stage pipeline MIPS150 processor. Serial port, graphics accelerator. If
+we look at the datapath elements, the storage elements, you've got your
+program counter, your instruction memory, register file, and data
+memory. Figure 7.1 from the book. If you mix that in with figure 8.28,
+which talks about MMIO: that data memory is hooked up to an address and
+data bus, and if you want to talk to a serial port on a MIPS processor (or
+an ARM processor, or something like that), you don't address a particular
+port (not like x86). Most ports are memory-mapped. There's actually an MMIO
+module that is also hooked up to the address and data bus; for some range
+of addresses, it's the one that handles reads and writes.</p>
+<p>You've got a handful of different modules down here such as a UART receive
+module and a UART transmit module. In your project, you'll have your
+personal computer that has a serial port on it, and that will be hooked up
+to your project, which contains the MIPS150 processor. Somehow, you've got
+to be able to handle characters transmitted in each direction.</p>
+<h2>UART</h2>
+<p>Common ground, TX on one side connected to RX port on other side, and vice
+versa. Whole bunch more in different connectors. The basic protocol is called
+RS-232, and it's common (people often refer to it by connector name: DB9, or
+rarely DB25). Fortunately, we've moved away from this world and use USB. We'll
+talk about these other protocols later, some sync, some async. Workhorse
+for a long time, still all over the place.</p>
+<p>You're going to build the UART receiver/transmitter and MMIO module that
+interfaces them. See when something's coming in from software /
+hardware. Going to start out with polling; we will implement interrupts
+later on in the project (for timing and serial IO on the MIPS
+processor). That's really the hardcore place where software and hardware
+meet. People who understand how each interface works and how to use those
+optimally together are valuable and rare people.</p>
+<p>What you're doing in Lab 4: there are really two concepts, (1) how does
+serial / UART work and (2) the ready / valid handshake.</p>
+<p>On the MIPS side, you've got some addresses. Anything that starts with FFFF
+is part of the memory-mapped region. In particular, the first four are
+mapped to the UART: they are RX control, RX data, TX control, and TX data.</p>
+<p>When you want to send something out the UART, you write the byte -- there's
+just one bit for the control and one byte for data.</p>
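+<p>A hedged sketch of the software side of that (the register offsets and the load/store helpers here are placeholders, not the project's actual memory map): poll the TX control bit until the transmitter is ready, then store the byte to TX data.</p>
+<pre><code># Pseudocode-style sketch of polled, memory-mapped UART transmit. The
+# addresses and the load_word/store_word helpers stand in for whatever the
+# project's memory map and ISA actually provide.
+
+UART_TX_CONTROL = 0xFFFF0008   # hypothetical: bit 0 = "ready for a byte"
+UART_TX_DATA    = 0xFFFF000C   # hypothetical: low byte = data to send
+
+def transmit_byte(load_word, store_word, byte):
+    while load_word(UART_TX_CONTROL) % 2 == 0:   # spin until ready
+        pass
+    store_word(UART_TX_DATA, byte % 256)
+
+def transmit_string(load_word, store_word, text):
+    for ch in text.encode("ascii"):
+        transmit_byte(load_word, store_word, ch)
+
+# Tiny mock of the address space so the sketch runs on its own.
+memory = {UART_TX_CONTROL: 1, UART_TX_DATA: 0}
+transmit_string(memory.get, memory.__setitem__, "ok")
+print(hex(memory[UART_TX_DATA]))
+</code></pre>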
+<p>Data goes into some FSM system, and you've got an RX shift register and a
+TX shift register.</p>
+<p>There's one other piece of this, which is that inside of here, the thing
+interfacing to this IO-mapped module uses this ready bit. If you have two
+modules, a source and a sink (diagram from the document): the source has
+some data that it is sending out and tells the sink when the data is valid,
+and the sink tells the source when it is ready. And there's a shared "clock"
+(baud rate), and this is a synchronous interface.</p>
+<ul>
+<li>source presents data</li>
+<li>source raises valid</li>
+<li>when ready &amp; valid on posedge clock, both sides know the transaction was
+ successful.</li>
+</ul>
+<p>Whatever order this happens in, source is responsible for making sure data
+is valid.</p>
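+<p>A small simulation (mine, not the lab FSMs) of that handshake: on each clock edge a word transfers exactly when ready and valid are both high, and the source holds its data stable until that happens.</p>
+<pre><code># Ready/valid handshake sketch. The source presents data and raises valid;
+# the sink raises ready when it can accept a word; a transfer completes on
+# a clock edge where both are high, regardless of which was raised first.
+
+import random
+
+def simulate(words, cycles=50):
+    to_send, received = list(words), []
+    valid, data = False, None
+    for _ in range(cycles):
+        if to_send and not valid:        # source presents data, raises valid
+            data, valid = to_send[0], True
+        ready = random.choice([True, False])   # a sometimes-busy sink
+        if ready and valid:              # rising edge: transaction succeeds
+            received.append(data)
+            to_send.pop(0)
+            valid = False
+    return received
+
+print(simulate([1, 2, 3, 4]))   # with enough cycles: all four, in order
+</code></pre>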
+<p>HDLC? Takes bytes and puts them into packets, ACKs, etc.</p>
+<p>Talk about quartz crystals, resonators. <mathjax>$\pi \cdot 10^7$</mathjax>.</p>
+<p>So: before I let you go, parallel load, n bits in, serial out, etc.</p>
+<p><a name='11'></a></p>
+<h1>CS 150: Digital Design &amp; Computer Architecture</h1>
+<h2>September 25, 2012</h2></div><div class='pos'></div>
<script src='mathjax/unpacked/MathJax.js?config=default'></script>
<script type="text/x-mathjax-config">
MathJax.Hub.Register.StartupHook("TeX Jax Ready",function () {
338 cs_h195.html
@@ -382,7 +382,343 @@
<p><a name='3'></a></p>
<h1>CS H195: Ethics with Harvey</h1>
<h2>September 17, 2012</h2>
-<p>Lawsuit to get records about NSA's surveillance information.</p></div><div class='pos'></div>
+<p>Lawsuit to get records about NSA's surveillance information.</p>
+<p>Video games affecting people, evidently.</p>
+<p>Government subpoenaed Twitter to give people tweets.</p>
+<p>Records can be subpoenaed in a court case, etc. We'll see how this plays
+out. In today's Daily Cal: UCB is suing big companies. Universities do
+research, etc. Back in the day, core memory meant people paid money to IBM
+and MIT. Berkeley holds a bunch of patents. Non-software seems reasonable.</p>
+<p>Important point: the burst of genius is very rarely true. Enabling
+technologies have reached the point of making things feasible. Usual story
+about inventions. Flash bulb in a camera, single-use: before sustainable
+light bulb. Steam engine. Some inventions aren't like that. Some really do
+just come to somebody (velcro, xerography). Nobody else was working on
+that. More often, everyone is thinking about this stuff.</p>
+<p>IP. A patent is the right to develop an invention, to produce things
+dependent on an invention. Copyright is not about invention, it's about
+creative and artistic works. And there, if you have an idea and write about
+it, other people are allowed to use your ideas, not your words. Trademark,
+you know what it is; you can register one; people are not allowed to use it
+in ways that might confuse people. You can in principle make a vacuum
+cleaner called "time". How close do things have to be to raise a lawsuit?
+Lawsuit about Apple Computer vs Apple Records. Apple Computer later did go
+into the music business, which caused a later round of battling.</p>
+<p>Personal likeness, I can't take a picture of you and publish it with
+certain exceptions. Most important for famous people. Funny rules:
+newsworthy, and news photographers are allowed to take pictures of
+newsworthy people.</p>
+<p>Trade secrets: if a company has secrets, and you are a competing company,
+you may not send a spy to extract these secrets.</p>
+<p>House ownership. There are places where people have owned houses for
+millennia. Patents and copyrights are not like that: not a right. Those
+things are bargains between creators and society. Purpose to society is
+that these eventually belong to the public. One of the readings talks about
+a different history of patents quoting Italian legal scholars, and if
+correct, patents were supposed to be permanent ownership. Why might it be
+good to society? Used to be people who made new inventions. Guilds. Hard to
+join, and you would be a slave for a while. Master would teach apprentice
+the trade, and advantage was that it reduced competition. Trouble was that
+there is a long history of things people used to be able to do that we
+can't anymore. Textbook example: Stradivarius violins.</p>
+<p>Nonetheless, nobody knows how Stradivarius made violins. Stories about how
+to make paints of particular colors. What the patent system is trying to
+avoid. Describe how invention works so someone in the field can create
+it. By making this disclosure, you are given a limited-term exclusive right
+to make these.</p>
+<p>The thing is, sooner or later, your technology is going to be obsolete. To
+your advantage to have a clear legal statement.</p>
+<p>Patent treaties. Used to be that if you invented something important, you'd
+hire a bunch of lawyers.</p>
+<p>Until recently, software was not patentable. AT&amp;T wanted to patent the
+SETUID bit. In those days, you could not patent any math or software or
+algorithm.</p>
+<p>Patents stifling innovation in the field. When you file a patent
+application, let's say the patent is denied. You would like to fall back on
+trade secrecy. Patent applications are secret until approved. Startups
+doomed. Wouldn't matter if term were short compared to innovation cycle of
+the industry.</p>
+<p>Another thing in the Constitution is that treaties take precedence over
+domestic laws.</p>
+<p>So let's talk about copyrights! So. Nobody says let's do away with
+copyright altogether. Copyright (at its worst) is less socially harmful
+than patents because it's so specific. Again, copyrights are a
+bargain. Started in Britain between the King and printers. Printers wanted
+exclusive right to things they printed. King wanted printers to be
+censors. Originally not authors who had copyright, but the publisher. Often
+creators of rights will sell the rights to publishers.</p>
+<p>This is where computers come in. How to sell to world? Used to need big
+company with facilities to create copies and widely
+distribute. Self-publish: work available to everyone. Important: rarely
+author who complains about copyrights. Usually publishers.</p>
+<p>There's always been piracy, but limited historically by analog media losing
+information when copying.</p>
+<p>Term of copyright has gotten longer and longer. Lawsuit about this about
+the most recent extension. In effect, making permanent copyright, against
+constitution. Ironic because under today's copyright law, much of what
+made Disney rich would still have been under copyright. Lot of exceptions to
+copyright law. Fair use. e.g. cannot write a Harry Potter novel, but can
+write a Harry Potter parody. Famous case: Gone with the Wind. About how
+wonderful life was for the owners of slaves. Someone wrote a book
+(retelling from slave's point of view); ruled as fair use (political
+commentary, protected by free speech).</p>
+<p>Stallman actually invented a system that has 5 different categories of
+work. Even Stallman doesn't say to ditch copyright. Hardly any musicians
+make any money selling music because their contracts say that they make a
+certain percentage of net proceeds. The way musicians survive is concerts,
+and ironically, selling concert CDs. Stallman says to make music players
+have a money button and send money directly to the musician.</p>
+<p><a name='4'></a></p>
+<h1>CS H195: Ethics with Harvey</h1>
+<h2>September 24, 2012</h2>
+<p>Vastly oversimplified picture of moral philosophy. Leaves out a lot.</p>
+<p>So Socrates says famously "to know the good is to desire the good", by
+which he means that if you really understand what's in your own interest,
+it's going to turn out to be the right thing. Counter-intuitive, since
+we've probably encountered situations in which we think what's good for us
+isn't good for the rest of the community.</p>
+<p>Ended up convicting Socrates, and he was offered the choice between exile
+from Athens and death -- chose death because he felt that he could not
+exist outside of his own community. His most famous student was Plato, who
+started an Academy (Socrates just wandered around from hand to mouth), took
+in students (one of whom was Aristotle). If you're scientists or engineers,
+you've been taught to make fun of Aristotle, since he said that heavier
+objects fall faster than light objects, and famously, Galileo took two
+objects, dropped them, and they hit the ground at the same time.</p>
+<p>It's true that some of the things Aristotle said about the physical world
+have turned out not to be right. But it's important to understand it in
+terms of the physical world, he did not have the modern idea of trying to
+make a universal theory that explained everything.</p>
+<p>Objects falling in atmosphere with friction different from behavior of
+planets orbiting sun? Perfectly fine with Aristotle.</p>
+<p>One of the things Aristotle knew? When you see a plate of donuts, you know
+perfectly well that it's just carbs and fat and you shouldn't eat them, but
+you do anyway. Socrates explains that as "you don't really know through and
+through that it is bad for you", and Aristotle doesn't like that
+explanation. Knowing what to do and actually doing it are two different
+things. Took that in two directions: action syllogism (transitivity),
+extended so that conclusion of the syllogism can be an action. Not
+important to us: important to us is that he introduces the idea of
+virtues. A virtue is not an understanding of what's right, but a habit --
+like a good habit you get into.</p>
+<p>Aristotle lists a bunch of virtues, and in all cases he describes it as a
+midpoint between two extremes (e.g. courage between cowardice and
+foolhardiness, or honesty as a middle ground between dishonesty and saying
+too much).</p>
+<p>Better have good habits, since you don't have time in real crises to
+think. So Aristotle's big on habits. And he says that you learn the virtues
+through being a member of a community and through the role you play in that
+community. He lived in a time when people inherited roles a lot. The argument
+goes a little like this. What does it mean to be a good person? Hard
+question. What does it mean to be a good carpenter? Much easier. A good
+carpenter builds stuff that holds together and looks nice, etc. What are
+the virtues that lead to being a good carpenter? Also easy: patience, care,
+measurement, honesty, etc. Much easier than what's a good
+person.</p>
+<p>Aristotle's going to say that the virtues of being a good person are
+precisely the virtues you learn in social practices from people older than
+you who are masters of the practice. One remnant of that in modern society
+is martial arts instruction. When you go to a martial arts school and say
+you want to learn, one of the first things you learn is respect for your
+instructor, and you're supposed to live your life in a disciplined way, and
+you're not learning skills so much as habits. Like what Aristotle'd say
+about any practice. Not so much of that today: when you're learning to be a
+computer scientist, there isn't a lot of instruction in "here are the
+habits that make you a (morally) good computer scientist".</p>
+<p>Kant was not a communitarian: was more of "we can figure out the right
+answer to ethical dilemmas." He has an axiom system, just like in
+mathematics: with small numbers of axioms, you can prove things. Claims
+just one axiom, which he describes in multiple ways.</p>
+<p>Categorical imperative number one: treat people as ends, not means. This is
+the grown-up version of the golden rule. Contracts are all right as long as
+both parties have their needs met and exchange is not too unequal.</p>
+<p>Second version: universalizability. An action is good if it is
+universalizable. That means, if everybody did it, would it work? Textbook
+example is "you shouldn't tell lies". The only reason telling lies works is
+because people usually tell the truth, and so people are predisposed to
+thinking that it's usually true. If everyone told lies, then we'd be
+predisposed to disbelieve statements. Lying would no longer be effective.</p>
+<p>There's a third one which BH can never remember which is much less
+important. Kant goes on to prove theorems to resolve moral dilemmas.</p>
+<p>Problem from Kant: A runs past you into the house. B comes up with a gun
+and asks you where A is. Kant suggests something along the lines of
+misleading B.</p>
+<p>Axiomatic, resolve ethical problems through logic and proving what you want
+to do. Very popular among engineers, mainly for the work of Rawls, who
+talks about the veil of ignorance. You have to imagine yourself, looking at
+life on Earth, and not knowing in what social role you're going to be
+born. Rawls thinks that from this perspective, you have to root for the
+underdog when situations come up, because in any particular thing that
+comes up, harm to the rich person is going to be less than the gains of the
+poor person (in terms of total wealth, total needs). Going to worry about
+being on side of underdog, etc. More to Rawls: taking into account how
+things affect all different constituencies.</p>
+<p>Another descendant of Plato are utilitarians. One of the reasons it's
+important for you to understand this chart: when you don't think about it
+too hard, you use utilitarian principles, which is sometimes
+bad. Utilitarians talk about the greatest good for the greatest number.</p>
+<p>Back to something from this class: what if I illegally download some movie?
+Is that okay? How much do I benefit, and how much is the movie-maker
+harmed? Not from principled arguments, which is what Kant wants you to do,
+but from nuts and bolts, who benefits how much, each way.</p>
+<p>Putting that in a different fashion, Kantians are interested in what
+motivates your action, why you did it. Utilitarians are interested in the
+result of your action. One thing that makes utilitarian hard is that you
+have to guess as to what probably will happen.</p>
+<p>Now I want to talk to you about MacIntyre. Gave you a lot of reading,
+probably hardest reading in the course. Talks like a philosopher. Uses
+"desert" as what you deserve (the noun of deserve). Life-changing for BH when he
+came across MacIntyre; passing it on to you as a result.</p>
+<p>He starts by saying to imagine an aftermath in which science is blamed and
+destroyed. A thousand years later, some people digging through the remains
+of our culture read about this word science, and it's all about
+understanding how the physical world works, and they want to revive this
+practice. Dig up books by scientists, read and memorize bits of them,
+analyze, have discussions. The people who do this call themselves
+scientists because they're studying science.</p>
+<p>We from our perspective would say that isn't science at all -- you don't
+just engage with books, but rather engage with the physical world through
+experiments. Those imagined guys from a millennium from now have lost the
+practice. They think they're following a practice, but they have no idea
+what it's like. MacIntyre argues this is us with ethics.</p>
+<p>Equivalent to WW3 according to MacIntyre is Kant. Kant really, more than
+anyone else, brought into being the modern era. Why? Because in the times
+prior to Kant, a lot of arguments not only about ethics but also by the
+physical world were resolved by religious authority. Decisions made based
+on someone's interpretation of the bible, e.g.</p>
+<p>Kant claims to be a Christian, but he thinks the way we understand God's
+will is by applying the categorical imperative. Instead of asking a priest
+what to do, we reason it out. We don't ask authorities, we work it out.
+Also, he starts this business of ethical dilemmas. Everybody in the top
+half of the world talks in terms of the good life. Even Socrates, who
+thinks you can know what to do, talks about the good life, too. So ethics
+is not about "what do I do in this situation right now", but rather the
+entirety of one's life and what it means to live a good life.</p>
+<p>Kant and Mill: no sense of life as a flow; rather, moments of
+decisions. What MacIntyre calls the ethical equivalent of WW3: at that
+point, we lost the thread, since we stopped talking about the good
+life. Now, it wasn't an unmitigated disaster, since it gives us -- the
+modern liberal society, not in the American sense of voting for democrats,
+but in the sense that your life goals are up to you as an individual, and
+the role of society is to build infrastructure and to keep people from
+getting in each other's way, stopping people from doing things only when
+they mess up someone else's life. I can, say, have some sexual practice
+different from yours. So that was a long time coming. Now, in our
+particular culture, the only thing that's bad is having sex with children,
+as far as I can tell -- as long as it doesn't involve you messing up
+someone else's life, e.g. rape. As long as it involves two (or more?)
+consenting adults, that's okay.</p>
+<p>MacIntyre says that there are things that came up with Kant that we can't
+just turn back to being Aristotelian. The people who lived the good life
+were male Athenian citizens. They had wives who weren't eligible, and they
+had slaves who did most of the grunt work. And so male Athenian citizens
+could spend their time walking around chatting with Socrates because they
+were supported by slavery. And nobody wants to go back to that. No real way
+to go back to being Aristotelian without giving up modern civil rights.</p>
+<p>So. One of the things I really like about MacIntyre is the example of
+wanting to teach a child how to play chess, but he's not particularly
+interested. He is, however, interested in candy. You say, every time you
+play with me, I'll give you a piece of candy. If you win, two pieces. Will
+play in a way that's difficult but possible to beat me. So, MacIntyre says
+this child is now motivated to play and to play well. But he's also
+motivated to cheat, if he can get away with it. So let's say this
+arrangement goes on for some time, and the kid gets better at it. What you
+hope is that the child reaches a point where the game is valuable to
+itself: he or she sees playing chess as rewarding (as an intellectual
+challenge). When that happens, cheating becomes self-defeating.</p>
+<p>While the child is motivated by external goods (rewards, money, fame,
+whatever), then the child is not part of the community of practice. But
+once the game becomes important (the internal benefits motivate him), then
+he does feel like part of the community. Huge chess community with
+complicated infrastructure with rating, etc. And that's a community with
+practice, and it has virtues (some of which are unique to chess, but maybe
+not -- e.g. planning ahead). Honesty, of course; patience; personal
+improvement.</p>
+<p>And the same is true with most things that human beings do. Not
+everything. MacIntyre raises the example of advertising. What are the
+virtues of this practice? Well, appealing to people in ways that they don't
+really see; suggesting things that aren't quite true without saying
+them. He lists several virtues that advertising people have, and these
+virtues don't generalize. Not part of being a good person; not even
+compatible with being a good person. So different from virtues of normal
+practices.</p>
+<p>Having advertising writers is one of the ways in which MacIntyre thinks
+we've just lost the thread. The reason we have them is that we hold up in
+our society the value of furthering your own ambition and getting rich, and
+not getting rich by doing something that's good anyway, but just getting
+rich. That's an external motivation rather than an internal one.</p>
+<p>We talk about individuals pursuing their own ends. We glorify -- take as an
+integral part of our society -- as individuals pursuing their own ends. In
+a modern understanding of ethics, you approach each new situation as if
+you've never done anything. You don't learn from experience; you learn from
+rules. The result may be the same for each intermediate situation, but it
+leads to you thinking differently. You don't think about building good
+habits in this context.</p>
+<p>A lot of you probably exercise (unlike me). Maybe you do it because it's
+fun, but maybe you also do it because it only gets harder as you get older,
+and you should get in the habit to keep it up. In that area, you get into
+habits. But writing computer programs, we tell you about rules (don't have
+concurrency violations), and I guess implicitly, we say that taking 61B is
+good for you because you learn to write bigger programs. Still true --
+still a practice with virtues.</p>
+<p>Two things: that sort of professional standard of work is a pretty narrow
+ethical issue. They don't teach you to worry about the privacy implications
+of third parties. Also, when people say they have an ethical dilemma, they
+think about it as a decision. A communitarian would reject all that ethical
+dilemma stuff. Dilemmas will have bad outcomes regardless. Consider Greek
+tragedies. When Oedipus finds himself married to his mother, it's like game
+over. Whole series of bad things that happen to him. Not much he can do
+about it on an incident by incident basis. Problem is a fatal flaw in his
+character early on (as well as some ignorance), and no system of ethics is
+going to lead Oedipus out of this trap. What you have to do is try not to get
+into traps, and you do that through prudence and honesty and whatnot.</p>
+<p>Classic dilemma: Heinz is a guy whose wife has a fatal disease that can be
+cured by an expensive drug, but Heinz is poor. So he goes to the druggist
+and says that he can't afford to pay for this drug but his wife is going
+to die; the druggist still says no. So Heinz is considering breaking into the
+drugstore at night and stealing the drug so his wife can live. What should
+he do and why? According to the literature, there's no right answer. What
+matters is your reason.</p>
+<p>I'm going to get this wrong, but it's something like this. Stage one: your
+immediate needs are what matter. Yes, he should steal it, because it's his
+wife, or no, he shouldn't steal it, because he should go to prison. Stage
+two: something like worrying about consequences to individuals. Might hurt
+druggist or might hurt his wife. Stage three: something like "well, I have
+a closer relationship to my wife than the druggist; I care more about my
+wife, so I should steal it". Stage four: it's against the law, and I
+shouldn't break the law. Stage five: like stage three, generalized to
+larger community: how much will it hurt my wife not to get the drug? A
+lot. How much will it hurt the druggist if I steal it? Some money. Stage
+six, based not on laws of community, but rather on the standards of the
+community. Odd-numbered stages are about specific people. Even-numbered
+stages are about society and rules (ranging from "punishment if I do it" to
+"it's the law" to "it's what people expect of me").</p>
+<p>Right now I'm talking about the literature of moral psychology: people go
+through these stages (different ways of thinking). Question posed is not
+"how do people behave", but rather "how should people behave".</p>
+<p>This is modern ethical reasoning. Take some situation that has no right
+answer, and split hairs about finding a right answer somehow.</p>
+<p>Talk about flying: checklist for novices. Instructors don't use this list:
+eventually, you get to where you're looking at the entire dashboard at
+once, and things that aren't right jump out at you.</p>
+<p>Another example: take a bunch of chess pieces, put them on the board, get
+someone to look at it for a minute, and take the pieces away, and ask the
+person to reconstruct the board position. Non-chess players are terrible
+(unsurprisingly); chess grandmasters can do it if it came out of a real
+game; if you put it randomly, they're just as bad as the rest of
+us. They're not looking at individual pieces; they're looking at the board
+holistically (clusters of pieces that interact with each other).</p>
+<p>Relevance to this about ethics: we don't always know why we do things. Very
+rare that we have the luxury to figure out either what categorical
+imperative tells us or utilitarian approach. Usually we just do something.</p>
+<p>BH with weaknesses. Would be stronger if his education was less about
+thinking things through and more about doing the right thing.</p>
+<p>Our moral training is full of "Shalt Not"s. Lot more in the Bible about
+what not to do than what to do or how to live the good life (that part of
+the Bible -- gets better). We also have these laws. Hardly ever say you
+have to do something (aside from paying taxes). Mostly say what you can't
+do. Never say how to live the good life. BH thinks that serves us ill. Have
+to make decisions. Often, what you do is different from what you say you
+should do.</p></div><div class='pos'></div>
<script src='mathjax/unpacked/MathJax.js?config=default'></script>
<script type="text/x-mathjax-config">
MathJax.Hub.Register.StartupHook("TeX Jax Ready",function () {
393 ee221a.html
@@ -852,7 +852,398 @@
necessarily in the space. Example: any continued fraction.</p>
<p>To show (1), we'll show that this sequence <mathjax>$\{x_m\}$</mathjax> that we constructed is
a Cauchy sequence in a Banach space. Interestingly, it matters what norm
-you choose.</p></div><div class='pos'></div>
+you choose.</p>
+<p><a name='8'></a></p>
+<h1>EE 221A: Linear System Theory</h1>
+<h2>September 18, 2012</h2>
+<p>Today:</p>
+<ul>
+<li>proof of existence and uniqueness theorem.</li>
+<li>[ if time ] introduction to dynamical systems.</li>
+</ul>
+<p>First couple of weeks of review to build up basic concepts that we'll be
+drawing upon throughout the course. Either today or Thursday we will launch
+into linear system theory.</p>
+<p>We're going to recall where we were last time. We had the fundamental
+theorem of differential equations, which said the following: if we had a
+differential equation, <mathjax>$\dot{x} = f(x,t)$</mathjax>, with initial condition <mathjax>$x(t_0) =
+x_0$</mathjax>, where <mathjax>$x(t) \in \Re^n$</mathjax>, etc, if <mathjax>$f( \cdot , t)$</mathjax> is Lipschitz
+continuous, and <mathjax>$f(x, \cdot )$</mathjax> is piecewise continuous, then there exists a
+unique solution to the differential equation / initial condition pair (some
+function <mathjax>$\phi(t)$</mathjax>) wherever you can take the derivative (may not be
+differentiable everywhere: loses differentiability on the points where
+discontinuities exist).</p>
+<p>We spent quite a lot of time discussing Lipschitz continuity. Job is
+usually to test both conditions; first one requires work. We described a
+popular candidate function by looking at the mean value theorem and
+applying it to <mathjax>$f$</mathjax>: a norm of the Jacobian function provides a candidate
+Lipschitz if it works.</p>
+<p>We also described local Lipschitz continuity, and often, when using a norm
+of the Jacobian, that's fairly easy to show.</p>
+<p>Important point to recall: a norm of the Jacobian of <mathjax>$f$</mathjax> provides a
+candidate Lipschitz function.</p>
+<p>Another important thing to say here is that we can use any norm we want, so
+we can be creative in our choice of norm when looking for a better bound.</p>
+<p>We started our proof last day, and we talked a little about the structure
+of the proof. We are going to proceed by constructing a sequence of
+functions, then show (1) that it converges to a solution, then show (2)
+that it is unique.</p>
+<h2>Proof of Existence</h2>
+<p>We are going to construct this sequence of functions as follows:
+<mathjax>$x_{m+1}(t) = x_0 + \int_{t_0}^t f(x_m(\tau), \tau) d\tau$</mathjax>. Here we're dealing with
+an arbitrary interval from <mathjax>$t_1$</mathjax> to <mathjax>$t_2$</mathjax>, and so <mathjax>$t_0 \in [t_1, t_2]$</mathjax>. We
+want to show that this sequence is a Cauchy sequence, and we're going to
+rely on our knowledge that the space these functions are defined in is a
+Banach space (hence this sequence converges to something in the space).</p>
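+<p>A numerical illustration (my sketch, not part of the proof) of this construction for the scalar example <mathjax>$\dot{x} = -x$</mathjax>, <mathjax>$x(0) = 1$</mathjax> on <mathjax>$[0,1]$</mathjax>: each iterate adds the next Taylor term of <mathjax>$e^{-t}$</mathjax>, and the sup-norm error shrinks factorially, as the bound derived below predicts.</p>
+<pre><code># Picard iteration x_{m+1}(t) = x0 + integral from t0 to t of f(x_m(tau))
+# for xdot = -x, x(0) = 1 on [0, 1]. Each iterate is the next Taylor
+# polynomial of exp(-t); the sup-norm error shrinks roughly like 1/m!.
+
+import numpy as np
+
+t = np.linspace(0.0, 1.0, 2001)
+dt = t[1] - t[0]
+f = lambda x: -x                     # right-hand side (time-invariant here)
+x0 = 1.0
+
+x_m = np.full_like(t, x0)            # zeroth iterate: the constant x0
+exact = np.exp(-t)
+for m in range(1, 8):
+    integrand = f(x_m)
+    avg = 0.5 * (integrand[1:] + integrand[:-1])
+    integral = np.concatenate(([0.0], np.cumsum(avg) * dt))
+    x_m = x0 + integral              # next element of the sequence
+    err = np.max(np.abs(x_m - exact))
+    print(f"iterate {m}: sup-norm error = {err:.2e}")
+</code></pre>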
+<p>We have to put a norm on the set of reals, so we'll use the infinity
+norm. Not going to prove it, but rather state it's a Banach space. If we
+show that this is a Cauchy sequence, then the limit of that Cauchy sequence
+exists in the space. The reason that's interesting is that it's this limit
+that provides a candidate for this differential equation.</p>
+<p>We will then prove that this limit satisfies the DE/IC pair. That is
+adequate to show existence. We'll then go on to prove uniqueness.</p>
+<p>Our immediate goal is to show that this sequence is Cauchy, which is, we
+should show that <mathjax>$\mag{x_{m+p} - x_m} \to 0$</mathjax> as <mathjax>$m$</mathjax> gets large, uniformly in <mathjax>$p$</mathjax>.</p>
+<p>First let us look at the difference between <mathjax>$x_{m+1}$</mathjax> and <mathjax>$x_m$</mathjax>. Just
+functions of time, and we can compute this. <mathjax>$\mag{x_{m+1} - x_m} =
+\mag{\int_{t_0}^t (f(x_m, \tau) - f(x_{m-1}, \tau)) d\tau}$</mathjax>. Use the fact that <mathjax>$f$</mathjax>
+is Lipschitz continuous, and so this is <mathjax>$\le \int_{t_0}^t k(\tau)\mag{x_m(\tau) -
+x_{m-1}(\tau)} d\tau$</mathjax>. The Lipschitz function <mathjax>$k$</mathjax> is piecewise continuous, so it
+has a supremum in this interval. Let <mathjax>$\bar{k}$</mathjax> be the supremum of <mathjax>$k$</mathjax> over
+the whole interval <mathjax>$[t_1, t_2]$</mathjax>. This means that we can take this
+inequality and rewrite it as <mathjax>$\mag{x_{m+1} - x_m} \le \bar{k} \int_{t_0}^t
+\mag{x_m(\tau) - x_{m-1}(\tau)} d\tau$</mathjax>. Now we have a bound that relates
+the difference between <mathjax>$x_{m+1}$</mathjax> and <mathjax>$x_m$</mathjax> to the difference between <mathjax>$x_m$</mathjax>
+and <mathjax>$x_{m-1}$</mathjax>; applying it repeatedly relates the distance between two
+subsequent elements to earlier distances.</p>
+<p>Let us do two things: sort out the integral on the right-hand-side, then
+look at arbitrary elements beyond an index.</p>
+<p>We know that <mathjax>$x_1(t) = x_0 + \int_{t_0}^t f(x_0, \tau) d\tau$</mathjax>, and that <mathjax>$\mag{x_1
+- x_0} \le \int_{t_0}^{t} \mag{f(x_0, \tau)} d\tau \le \int_{t_1}^{t_2}
+ \mag{f(x_0, \tau)} d\tau \defequals M$</mathjax>. From the above inequalities,
+ <mathjax>$\mag{x_2 - x_1} \le M \bar{k}\abs{t - t_0}$</mathjax>. Now I can look at general
+ bounds: <mathjax>$\mag{x_3 - x_2} \le \frac{M\bar{k}^2 \abs{t - t_0}^2}{2!}$</mathjax>. In general,
+ <mathjax>$\mag{x_{m+1} - x_m} \le \frac{M\parens{\bar{k} \abs{t - t_0}}^m}{m!}$</mathjax>.</p>
+<p>If we look at the norm of <mathjax>$\dot{x}$</mathjax>, that is going to be a function
+norm. What I've been doing up to now is look at a particular value <mathjax>$t_1 &lt; t
+&lt; t_2$</mathjax>.</p>
+<p>Try to relate this to the norm <mathjax>$\mag{x_{m+1} - x_m}_\infty$</mathjax>. Can what we've
+done so far give us a bound on the difference between two functions? We
+can, because the infinity norm of a function is the maximum value that the
+function assumes (maximum vector norm for all points <mathjax>$t$</mathjax> in the interval
+we're interested in). If we let <mathjax>$T$</mathjax> be the length of the interval
+<mathjax>$t_2 - t_1$</mathjax>, we can use the previous result on the pointwise norm,
+then a bound on the function norm has to be less than the same
+bound, i.e. if a pointwise norm function is less than this bound for all
+relevant <mathjax>$t$</mathjax>, then its max value must be less than this bound.</p>
+<p>That gets us on the road we want to be, since that now gets us a bound. We
+can now go back to where we started. What we're actually interested in is
+given an index <mathjax>$m$</mathjax>, we can construct a bound on all later elements in the
+sequence.</p>
+<p><mathjax>$\mag{x_{m+p} - x_m}_\infty = \mag{x_{m+p} - x_{m+p-1} + x_{m+p-1} - \ldots -
+x_m}_\infty = \mag{\sum_{k=0}^{p-1} (x_{m+k+1} - x_{m+k})}_\infty \le M \sum_{k=0}^{p-1}
+\frac{(\bar{k}T)^{m+k}}{(m+k)!}$</mathjax>.</p>
+<p>We're going to recall a few things from undergraduate calculus: Taylor
+expansion of the exponential function and <mathjax>$(m+k)! \ge m!k!$</mathjax>.</p>
+<p>With these, we can say that <mathjax>$\mag{x_{m+p} - x_m}_\infty \le
+M\frac{(\bar{k}T)^m}{m!} e^{\bar{k} T}$</mathjax>. What we'd like to show is that this
+can be made arbitrarily small as <mathjax>$m$</mathjax> gets large. We study this bound as <mathjax>$m
+\to \infty$</mathjax>, and we recall that we can use the Stirling approximation,
+which shows that factorial grows faster than the exponential function. That
+is enough to show that <mathjax>$\{x_m\}_0^\infty$</mathjax> is Cauchy. Since it is in a
+Banach space (not proving, since beyond our scope), it converges to
+something in the space: a function (call it <mathjax>$x^\ell$</mathjax>) in the same
+space.</p>
+<p>Now we just need to show that the limit <mathjax>$x^\ell$</mathjax> solves the differential
+equation (and initial condition). Let's go back to the sequence that
+determines <mathjax>$x^\ell$</mathjax>. <mathjax>$x_{m+1} = x_0 + \int_{t_0}^t f(x_m, \tau)
+d\tau$</mathjax>. We've proven that this limit converges to <mathjax>$x^\ell$</mathjax>. What we want to
+show is that if we evaluate <mathjax>$f(x^\ell, t)$</mathjax>, then <mathjax>$\int_{t_0}^t f(x_m, \tau) d\tau
+\to \int_{t_0}^t f(x^\ell, \tau) d\tau$</mathjax>. Would be immediate if we had that
+the function were continuous. Clear that it satisfies initial condition by
+the construction of the sequence, but we need to show that it satisfies the
+differential equation. Conceptually, this is probably more difficult than
+what we've just done (establishing bounds, Cauchy sequences). Thinking
+about what that function limit is and what it means for it to satisfy that
+differential equation.</p>
+<p>Now, you can basically use some of the machinery we've been using all along
+to show this. Difference between these goes to <mathjax>$0$</mathjax> as <mathjax>$m$</mathjax> gets large.</p>
+<p><mathjax>$$\mag{\int_{t_0}^t (f(x_m, \tau) - f(x^\ell, \tau)) d\tau}
+\\ \le \int_{t_0}^t k(\tau) \mag{x_m - x^\ell} d\tau \le \bar{k}\mag{x_m - x^\ell}_\infty T
+\\ \le \bar{k} M e^{\bar{k} T} \frac{(\bar{k} T)^m}{m!}T
+$$</mathjax></p>
+<p>Thus <mathjax>$x^\ell$</mathjax> solves the DE/IC pair. A solution <mathjax>$\Phi$</mathjax> is <mathjax>$x^\ell$</mathjax>,
+i.e. <mathjax>$\dot{x}^\ell(t) = f(x^\ell(t), t) \; \forall t \in [t_1, t_2] - D$</mathjax> and <mathjax>$x^\ell(t_0) =
+x_0$</mathjax>.</p>
+<p>To show that this solution is unique, we will use the Bellman-Gronwall
+lemma, which is very important. Used ubiquitously when you want to show
+that functions of time are equal to each other: candidate mechanism to do
+that.</p>
+<h2>Bellman-Gronwall Lemma</h2>
+<p>Let <mathjax>$u, k$</mathjax> be real-valued positive piece-wise continuous functions of time,
+and we'll have a constant <mathjax>$c_1 \ge 0$</mathjax> and <mathjax>$t_0 \ge 0$</mathjax>. If we have such
+constants and functions, then the following is true: if <mathjax>$u(t) \le c_1 +
+\int_{t_0}^t k(\tau)u(\tau) d\tau$</mathjax>, then <mathjax>$u(t) \le c_1 e^{\int_{t_0}^t
+k(\tau) d\tau}$</mathjax>.</p>
+<h2>Proof (of B-G)</h2>
+<p><mathjax>$t &gt; t_0$</mathjax> WLOG.</p>
+<p><mathjax>$$U(t) = c_1 + \int_{t_0}^t k(\tau) u(\tau) d\tau
+\\ u(t) \le U(t)
+\\ u(t)k(t)e^{-\int_{t_0}^t k(\tau) d\tau} \le U(t)k(t)e^{-\int_{t_0}^t k(\tau) d\tau}
+\\ \deriv{}{t}\parens{U(t)e^{-\int_{t_0}^t k(\tau) d\tau}} \le 0 \text{ (then integrate this derivative, noting that } U(t_0) = c_1 \text{)}
+\\ u(t) \le U(t) \le c_1 e^{\int_{t_0}^t k(\tau) d\tau}
+$$</mathjax></p>
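+<p>Spelling out the "then integrate" step in the same notation: integrating the nonpositive derivative from <mathjax>$t_0$</mathjax> to <mathjax>$t$</mathjax> and using <mathjax>$U(t_0) = c_1$</mathjax> gives</p>
+<p><mathjax>$$\int_{t_0}^t \deriv{}{\sigma}\parens{U(\sigma)e^{-\int_{t_0}^\sigma k(\tau) d\tau}} d\sigma \le 0
+\\ \Rightarrow U(t) e^{-\int_{t_0}^t k(\tau) d\tau} - c_1 \le 0
+\\ \Rightarrow u(t) \le U(t) \le c_1 e^{\int_{t_0}^t k(\tau) d\tau}
+$$</mathjax></p>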
+<h2>Using this to prove uniqueness of DE/IC solutions</h2>
+<p>Here's how we're going to use the B-G lemma to prove uniqueness.</p>
+<p>We have a solution that we constructed <mathjax>$\Phi$</mathjax>, and someone else gives us a
+solution <mathjax>$\Psi$</mathjax>, constructed via a different method. Show that these must
+be equivalent. Since they're both solutions, they have to satisfy the DE/IC
+pair. Take the norm of the difference between the differential equations.</p>
+<p><mathjax>$$\mag{\Phi - \Psi} \le \bar{k} \int_{t_0}^t \mag{\Phi - \Psi} d\tau \forall
+t_0, t \in [t_1, t_2]$$</mathjax></p>
+<p>From the Bellman-Gronwall Lemma, we can rewrite this inequality as
+<mathjax>$\mag{\Phi - \Psi} \le c_1 e^{\bar{k}(t - t_0)}$</mathjax>. Since <mathjax>$c_1 = 0$</mathjax>, this
+norm is less than or equal to 0. By positive definiteness, this norm must
+be equal to 0, and so the functions are equal to each other.</p>
+<h2>Reverse time differential equation</h2>
+<p>We think about time as monotonic (either increasing or decreasing, usually
+increasing). Suppose that time is decreasing, and we have <mathjax>$\dot{x} =
+f(x,t)$</mathjax>. We want to explore existence and uniqueness going
+backwards in time. Suppose we had a time variable <mathjax>$\tau$</mathjax> which goes from
+<mathjax>$t_0$</mathjax> backwards, and defined <mathjax>$\tau \defequals t_0 - t$</mathjax>. We want to define
+the solution to that differential equation backwards in time as <mathjax>$z(\tau) =
+x(t)$</mathjax> if <mathjax>$t &lt; t_0$</mathjax>. Derive what reverse order time derivative is. Equation
+is just <mathjax>$-f$</mathjax>; we're going to use <mathjax>$\bar{f}$</mathjax> to represent this
+function (<mathjax>$\deriv{}{\tau}z = -\deriv{}{t}x = -f(x, t) = -f(z, \tau) =
+\bar{f}$</mathjax>).</p>
+<p>This equation, if I solve the reverse time differential equation, we'll
+have some corresponding backwards solution. Concluding statement: can think
+about solutions forwards and backwards in time. Existence of unique
+solution forward in time means existence of unique solution backward in
+time (and vice versa). You can't have solutions crossing themselves in
+time-invariant systems.</p>
+<p><a name='9'></a></p>
+<h1>EE 221A: Linear System Theory</h1>
+<h2>September 20, 2012</h2>
+<p>Introduction to dynamical systems. Suppose we have equations <mathjax>$\dot{x} =
+f(x, u, t)$</mathjax>, <mathjax>$\fn{f}{\Re^n \times \Re^{n_i} \times \Re_+}{\Re^n}$</mathjax> and <mathjax>$y = h(x,
+u, t)$</mathjax>, <mathjax>$\fn{h}{\Re^n \times \Re^{n_i} \times \Re_+}{\Re^{n_o}}$</mathjax>. We define <mathjax>$n_i$</mathjax> as
+the dimension of the input space, <mathjax>$n_o$</mathjax> as dimension of the output space,
+and <mathjax>$n$</mathjax> as the dimension of the state space.</p>
+<p>We've looked at the form, and if we specify a particular <mathjax>$\bar{u}(t)$</mathjax> over some
+time interval of interest, then we can plug this into the right hand side
+of this differential equation. Typically we do not supply a particular
+input. Thinking about solutions to this differential equation, for now,
+let's suppose that it's specified.</p>
+<p>Suppose we have some feedback function of the state. If <mathjax>$u$</mathjax> is specified,
+as long as <mathjax>$\bar{f}$</mathjax> satisfies the conditions for the existence and
+uniqueness theorem, we have a differential equation we can solve.</p>
+<p>Another example: instead of differential equation (which corresponds to
+continuous time), we have a difference equation (which corresponds to
+discrete time).</p>
+<p>Example: dynamic system represented by an LRC circuit. One practical way to
+define the state <mathjax>$x$</mathjax> is as a vector of elements whose derivatives appear in
+our differential equation. Not formal, but practical for this example.</p>
+<p>Notions of discretizing.</p>
+<h2>What is a dynamical system?</h2>
+<p>As discussed in first lecture, we consider time <mathjax>$\Tau$</mathjax> to be a privileged
+variable. Based on our definition of time, the inputs and outputs are all
+functions of time.</p>
+<p>Now we're going to define a <strong>dynamical system</strong> as a 5-tuple: <mathjax>$(\mathcal{U},
+\Sigma, \mathcal{Y}, s, r)$</mathjax> (input space, state space, output space, state
+transition function, output map).</p>
+<p>We define the <strong>input space</strong> as the set of input functions over time to an
+input set <mathjax>$U$</mathjax> (i.e. <mathjax>$\mathcal{U} = \{\fn{u}{\Tau}{U}\}$</mathjax>). Typically, <mathjax>$U =
+\Re^{n_i}$</mathjax>.</p>
+<p>We also define the <strong>output space</strong> as the set of output functions over time to
+an output set <mathjax>$Y$</mathjax> (i.e. <mathjax>$\mathcal{Y} = \{\fn{y}{\Tau}{Y}\}$</mathjax>). Typically, <mathjax>$Y
+= \Re^{n_o}$</mathjax>.</p>
+<p><mathjax>$\Sigma$</mathjax> is our <strong>state space</strong>. Not defined as the function, but the actual
+state space. Typically, <mathjax>$\Sigma = \Re^n$</mathjax>, and we can go back and think
+about the function <mathjax>$x(t) \in \Sigma$</mathjax>. <mathjax>$\fn{x}{\Tau}{\Sigma}$</mathjax> is called the
+state trajectory.</p>
+<p><mathjax>$s$</mathjax> is called the <strong>state transition function</strong> because it defines how the
+state changes in response to time and the initial state and the
+input. <mathjax>$\fn{s}{\Tau \times \Tau \times \Sigma \times U }{\Sigma}$</mathjax>. Usually
+we write this as <mathjax>$x(t_1) = s(t_1, t_0, x_0, u)$</mathjax>, where <mathjax>$u$</mathjax> is the function
+<mathjax>$u(\cdot) |_{t_0}^{t_1}$</mathjax>. This is important: coming towards how we define
+state. Only things you need to get to state at the new time are the initial
+state, inputs, and dynamics.</p>
+<p>Finally, we have this <strong>output map</strong> (sometimes called the readout map)
+<mathjax>$r$</mathjax>. <mathjax>$\fn{r}{\Tau \times \Sigma \times U}{Y}$</mathjax>. That is, we can think about
+<mathjax>$y(t) = r(t, x(t), u(t))$</mathjax>. There's something fundamentally different
+between <mathjax>$r$</mathjax> and <mathjax>$s$</mathjax>. <mathjax>$s$</mathjax> depended on the function <mathjax>$u$</mathjax>, whereas <mathjax>$r$</mathjax> only
+depended on the current value of <mathjax>$u$</mathjax> at a particular time.</p>
+<p><mathjax>$s$</mathjax> captures dynamics, while <mathjax>$r$</mathjax> is static. Remark: <mathjax>$s$</mathjax> has dynamics
+(memory) -- things that depend on previous time, whereas <mathjax>$r$</mathjax> is static:
+everything it depends on is at the current time (memoryless).</p>
+<p>In order to be a dynamical system, we need to satisfy two axioms: a
+dynamical system is a five-tuple with the following two axioms:</p>
+<ul>
+<li>The <strong>state transition axiom</strong>: <mathjax>$\forall t_1 \ge t_0$</mathjax>, given <mathjax>$u, \tilde{u}$</mathjax>
+ that are equal to each other over a particular time interval, the state
+ transition functions must be equal over that interval, i.e. <mathjax>$s(t_1, t_0,
+ x_0, u) = s(t_1, t_0, x_0, \tilde{u})$</mathjax>. Requires us to not have
+ dependence on the input outside of the time interval of interest.</li>
+<li>The <strong>semigroup axiom</strong>: suppose you start a system at <mathjax>$t_0$</mathjax> and evolve it to
+ <mathjax>$t_2$</mathjax>, and you're considering the state. You have an input <mathjax>$u$</mathjax> defined
+ over the whole time interval. If you were to look at an intermediate
+ point <mathjax>$t_1$</mathjax>, and you computed the state at <mathjax>$t_1$</mathjax> via the state transition
+ function, we can split our time interval into two intervals, and we can
+ compute the result any way we like. Stated as the following: <mathjax>$s(t_2, t_1,
+ s(t_1, t_0, x_0, u), u) = s(t_2, t_0, x_0, u)$</mathjax>.</li>
+</ul>
+<p>When we talk about a dynamical system, we have to satisfy these axioms.</p>
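+<p>A concrete toy instance (mine, not from lecture) that satisfies both axioms: the scalar integrator <mathjax>$\dot{x} = u$</mathjax>, <mathjax>$y = x$</mathjax>, whose state transition function is <mathjax>$s(t_1, t_0, x_0, u) = x_0 + \int_{t_0}^{t_1} u(\tau) d\tau$</mathjax> and whose readout map is <mathjax>$r(t, x, u) = x$</mathjax>. The sketch below exercises the semigroup axiom numerically.</p>
+<pre><code># Toy dynamical system (scalar integrator xdot = u, y = x) written as the
+# two maps of the 5-tuple: state transition s and readout r. Numerically
+# check the semigroup axiom: s(t2, t0, x0, u) = s(t2, t1, s(t1, t0, x0, u), u).
+
+import numpy as np
+
+def s(t1, t0, x0, u, steps=10000):
+    tau = np.linspace(t0, t1, steps)      # integrate the input from t0 to t1
+    vals = u(tau)
+    return x0 + np.sum(0.5 * (vals[1:] + vals[:-1]) * np.diff(tau))
+
+def r(t, x, u_t):
+    return x                              # static readout: memoryless
+
+u = np.sin                                # an input defined for all time
+t0, t1, t2, x0 = 0.0, 1.3, 2.0, 5.0
+
+direct = s(t2, t0, x0, u)
+composed = s(t2, t1, s(t1, t0, x0, u), u)
+print(direct, composed)                   # agree up to integration error
+print("y(t2) =", r(t2, direct, u(t2)))
+</code></pre>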
+<h2>Response function</h2>
+<p>Since we're interested in the outputs and not the states, we can define
+what we call the <strong>response map</strong>. It's not considered part of the definition
+of a dynamical system because it can be easily derived.</p>
+<p>It's the composition of the state transition function and the readout map,
+i.e. <mathjax>$y(t) = r(t, x(t), u(t)) = r(t, s(t, t_0, x_0, u), u(t)) \defequals
+\rho(t, t_0, x_0, u)$</mathjax>. This is an important function because it is used to
+define properties of a dynamical system. Why is that? We've said that
+states are somehow mysterious. Not something we typically care about:
+typically we care about the outputs. Thus we define properties like
+linearity and time invariance.</p>
+<h2>Time Invariance</h2>
+<p>We define a time-shift operator <mathjax>$\fn{T_\tau}{\mathcal{U}}{\mathcal{U}}$</mathjax>,
+<mathjax>$\fn{T_\tau}{\mathcal{Y}}{\mathcal{Y}}$</mathjax>. <mathjax>$(T_\tau u)(t) \defequals u(t -
+\tau)$</mathjax>. Namely, the value of <mathjax>$T_\tau u$</mathjax> is that of the old signal at
+<mathjax>$t-\tau$</mathjax>.</p>
+<p>A <strong>time-invariant</strong> (dynamical) system is one in which the input space and
+output space are closed under <mathjax>$T_\tau$</mathjax> for all <mathjax>$\tau$</mathjax>, and <mathjax>$\rho(t, t_0,
+x_0, u) = \rho(t + \tau, t_0 + \tau, x_0, T_\tau u)$</mathjax>.</p>
+<h2>Linearity</h2>
+<p>A <strong>linear</strong> dynamical system is one in which the input, state, and output
+spaces are all linear spaces over the same field <mathjax>$\mathbb{F}$</mathjax>, and the
+response map <mathjax>$\rho$</mathjax> is a linear map of <mathjax>$\Sigma \times \mathcal{U}$</mathjax> into
+<mathjax>$\mathcal{Y}$</mathjax>.</p>
+<p>This is a strict requirement: you have to check that the response map
+satisfies these conditions. Question that comes up: why do we define
+linearity of a dynamical system in terms of linearity of the response and
+not the state transition function? Goes back to a system being
+intrinsically defined by its inputs and outputs. Often states, you can have
+many different ways to define states. Typically we can't see all of
+them. It's accepted that when we talk about a system and think about its
+I/O relations, it makes sense that we define linearity in terms of this
+memory function of the system, as opposed to the state transition function.</p>
+<p>Let's just say a few remarks about this: <strong>zero-input response</strong>,
+<strong>zero-state response</strong>. If we look at the zero element in our spaces (so
+we have a zero vector), then we can take our superposition, which implies
+that the response at time <mathjax>$t$</mathjax> is equal to the zero-state response, which is
+the response, given that we started at the zero state, plus the zero input
+response.</p>
+<p>That is: <mathjax>$\rho(t, t_0, x_0, u) = \rho(t, t_0, \theta_x, u) + \rho(t, t_0,
+x_0, \theta_u)$</mathjax> (from the definition of linearity).</p>
+<p>The second remark is that the zero-state response is linear in the input,
+and similarly, the zero-input response is linear in the state.</p>
+<p>One more property of dynamical systems before we finish: <strong>equivalence</strong> (a
+property derived from the definition). Take two dynamical systems <mathjax>$D = (U,
+\Sigma, Y, s, r), \tilde{D} = (U, \bar{\Sigma}, Y, \bar{s}, \bar{r})$</mathjax>. A state <mathjax>$x_0
+\in \Sigma$</mathjax> is equivalent to <mathjax>$\tilde{x}_0 \in \bar{\Sigma}$</mathjax> at <mathjax>$t_0$</mathjax> if <mathjax>$\forall t
+\ge t_0, \rho(t, t_0, x_0, u) = \tilde{\rho}(t, t_0, \tilde{x}_0, u)$</mathjax> for every input <mathjax>$u$</mathjax>.
+If this holds for all <mathjax>$x_0$</mathjax> and some corresponding <mathjax>$\tilde{x}_0$</mathjax>, the two systems are equivalent.</p>
+<p><a name='10'></a></p>
+<h1>EE 221A: Linear System Theory</h1>
+<h2>September 25, 2012</h2>
+<h2>Linear time-varying systems</h2>
+<p>Recall the state transition function gives the state at the current
+time as a function of the initial state, initial time, and inputs. Suppose you have a
+differential equation; how do you acquire the state transition function?
+Solve the differential equation.</p>
+<p>For a general dynamical system, there are different ways to get the state
+transition function. This is an instantiation of a dynamical system, and
+we're going to get the state transition function by solving the
+differential equation / initial condition pair.</p>
+<p>We're going to call <mathjax>$\dot{x}(t) = A(t)x(t) + B(t)u(t)$</mathjax> a vector
+differential equation with initial condition <mathjax>$x(t_0) = x_0$</mathjax>.</p>
+<p>So that requires us to think about solving that differential equation. Do a
+dimension check, to make sure we know the dimensions of the matrices. <mathjax>$x
+\in \Re^n$</mathjax>, so <mathjax>$A \in \Re^{n \times n}$</mathjax>. We could define the matrix
+function <mathjax>$A$</mathjax>, which takes intervals of the real line and maps them over to
+matrices. As a function, <mathjax>$A$</mathjax> is piecewise continuous matrix function in
+time.</p>
+<p>The entries are piecewise-continuous scalars in time. We would like to get
+at the state transition function; to do that, we need to solve the
+differential equation.</p>
+<p>Let's assume for now that <mathjax>$A, B, U$</mathjax> are given (part of the system
+definition).</p>
+<p>Checking the hypotheses of existence and uniqueness: piecewise continuity in
+time is immediate, and we can use the induced norm of <mathjax>$A(t)$</mathjax> as the
+Lipschitz bound. Since this induced norm is piecewise continuous in time, it
+is a fine bound, and <mathjax>$f$</mathjax> is globally Lipschitz continuous in <mathjax>$x$</mathjax>.</p>
+<p>We're going to back off for a bit and introduce the state transition
+matrix. Background for solving the VDE. We're going to introduce a matrix
+differential equation, <mathjax>$\dot{X} = A(t) X$</mathjax> (where <mathjax>$A(t)$</mathjax> is same as before).</p>
+<p>I'm going to define <mathjax>$\Phi(t, t_0)$</mathjax> as the solution to the matrix
+differential equation (MDE) for the initial condition <mathjax>$\Phi(t_0, t_0) =
+1_{n \times n}$</mathjax>. That is, <mathjax>$\Phi$</mathjax> is the solution to the <mathjax>$n
+\times n$</mathjax> matrix differential equation when it starts out at the identity
+matrix.</p>
+<p>Let's first talk about properties of this matrix <mathjax>$\Phi$</mathjax> just from the
+definition we have.</p>
+<ul>
+<li>If you go back to the vector differential equation, and let's just drop
+ the term that depends on <mathjax>$u$</mathjax> (either consider <mathjax>$B$</mathjax> to be 0, or the input
+ to be 0), the solution of <mathjax>$\dot{x}(t) = A(t)x(t)$</mathjax> is given by <mathjax>$x(t) =
+ \Phi(t, t_0)x_0$</mathjax>.</li>
+<li>This is what we call the semigroup property, since it's reminiscent of
+ the semigroup axiom. <mathjax>$\Phi(t, t_0) = \Phi(t, t_1) \Phi(t_1, t_0) \forall
+ t, t_0, t_1 \in \Re^+$</mathjax></li>
+<li><mathjax>$\Phi^{-1}(t, t_0) = \Phi(t_0, t)$</mathjax>.</li>
+<li><mathjax>$\text{det} \Phi(t, t_0) = \exp\parens{\int_{t_0}^t \text{tr} \parens{A
+ (\tau)} d\tau}$</mathjax>.</li>
+</ul>
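+<p>As a quick sanity check (an addition, not from the lecture): in the
+time-invariant special case where <mathjax>$A(t) \equiv A$</mathjax> is constant, the state
+transition matrix is the matrix exponential <mathjax>$\Phi(t, t_0) = e^{A(t - t_0)}$</mathjax>,
+and the listed properties can be verified directly:</p>
+<p><mathjax>$$\Phi(t, t_1)\Phi(t_1, t_0) = e^{A(t - t_1)} e^{A(t_1 - t_0)} = e^{A(t - t_0)},
+\qquad \Phi(t, t_0)^{-1} = e^{-A(t - t_0)} = \Phi(t_0, t),
+\qquad \det e^{A(t - t_0)} = \exp\parens{\text{tr}(A)(t - t_0)}$$</mathjax></p>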
+<p>Here, let's talk about some machinery we can now invoke when
+we want to show that two functions of time are equal to each other when
+they're both solutions to the differential equation. You can simply show by
+the existence and uniqueness theorem (assuming it applies) that they
+satisfy the same initial condition and the same differential
+equation. That's an important point, and we tend to use it a lot.</p>
+<p>(i.e. when faced with showing that two functions of time are equal to each
+other, you can show that they both satisfy the same initial condition and
+the same differential equation [as long as the differential equation
+satisfies the hypotheses of the existence and uniqueness theorem])</p>
+<p>Obvious, but good to state.</p>
+<p>Note: the initial condition doesn't have to be the initial condition given;
+it just has to hold at one point in the interval. Pick your point in time
+judiciously.</p>
+<p>Proof of (2): check <mathjax>$t=t_1$</mathjax>. (3) follows directly from (2). (4) you can
+look at if you want. Gives you a way to compute <mathjax>$\Phi(t, t_0)$</mathjax>. We've
+introduced a matrix differential equation and an abstract solution.</p>
+<p>Consider (1). <mathjax>$\Phi(t, t_0)$</mathjax> is a map that takes the initial state and
+transitions to the new state. Thus we call <mathjax>$\Phi$</mathjax> the <strong>state transition
+matrix</strong> because of what it does to the states of this vector differential
+equation: it transfers them from their initial value to their final value,
+and it transfers them through matrix multiplication.</p>
+<p>Let's go back to the original differential equation. Claim that the
+solution to that differential equation has the following form: <mathjax>$x(t) =
+\Phi(t, t_0)x_0 + \int_{t_0}^t \Phi(t, \tau)B(\tau)u(\tau) d\tau$</mathjax>. Proof:
+we can use the same machinery. If someone gives you a candidate solution,
+you can easily show that it is the solution.</p>
+<p>Recall the Leibniz rule, which we'll state in general as follows:
+<mathjax>$\pderiv{}{z} \int_{a(z)}^{b(z)} f(x, z) dx = \int_{a(z)}^{b(z)}
+\pderiv{}{z}f(x, z) dx + \pderiv{b}{z} f(b, z) - \pderiv{a}{z} f(a, z)$</mathjax>.</p>
+<p><mathjax>$$
+\dot{x}(t) &amp; = A(t) \Phi(t, t_0) x_0 + \int_{t_0}^t
+\pderiv{}{t} \parens{\Phi(t, \tau)B(\tau)u(\tau)} d\tau +
+\pderiv{t}{t}\parens{\Phi(t, t)B(t)u(t)} - \pderiv{t_0}{t}\parens{...}
+\\ &amp; = A(t)\Phi(t, t_0)x_0 + \int_{t_0}^t A(t)\Phi(t,\tau)B(\tau)u(\tau)d\tau + B(t)u(t)
+\\ &amp; = A(t)\Phi(t, t_0) x_0 + A(t)\int_{t_0}^t \Phi(t, \tau)B(\tau)
+u(\tau) d\tau + B(t) u(t)
+\\ &amp; = A(t)\parens{\Phi(t, t_0) x_0 + \int_{t_0}^t \Phi(t, \tau)B(\tau)
+u(\tau) d\tau} + B(t) u(t)
+$$</mathjax></p>
+<p><mathjax>$x(t) = \Phi(t,t_0)x_0 + \int_{t_0}^t \Phi(t,\tau)B(\tau)u(\tau) d\tau$</mathjax> is
+good to remember.</p>
+<p>Not surprisingly, it depends on the input function over an interval of
+time.</p>
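+<p>A minimal numerical sketch (my addition, not from the lecture), specialized
+to constant <mathjax>$A$</mathjax> and <mathjax>$B$</mathjax> so that <mathjax>$\Phi(t, \tau) = e^{A(t - \tau)}$</mathjax> is just a matrix
+exponential; it checks the variation-of-constants formula against direct
+numerical integration of the vector differential equation:</p>
+<pre><code>import numpy as np
+from scipy.linalg import expm
+from scipy.integrate import solve_ivp
+
+# Time-invariant special case: constant A and B, simple scalar input u(t).
+A = np.array([[0.0, 1.0], [-2.0, -3.0]])
+B = np.array([[0.0], [1.0]])
+u = lambda t: np.array([np.sin(t)])
+x0 = np.array([1.0, 0.0])
+t0, tf = 0.0, 5.0
+
+# Direct integration of xdot = A x + B u(t).
+sol = solve_ivp(lambda t, x: A @ x + B @ u(t), (t0, tf), x0,
+                rtol=1e-9, atol=1e-12)
+x_direct = sol.y[:, -1]
+
+# Variation of constants: x(t) = Phi(t, t0) x0 + int Phi(t, tau) B u(tau) dtau,
+# with Phi(t, tau) = expm(A (t - tau)). Integral done by a trapezoid sum.
+taus = np.linspace(t0, tf, 2001)
+vals = np.stack([expm(A * (tf - tau)) @ (B @ u(tau)) for tau in taus])
+dt = taus[1] - taus[0]
+integral = dt * (0.5 * vals[0] + vals[1:-1].sum(axis=0) + 0.5 * vals[-1])
+x_formula = expm(A * (tf - t0)) @ x0 + integral
+
+print(x_direct, x_formula)  # the two should agree to several decimal places
+</code></pre>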
+<p>The differential equation is changing over time, therefore the system
+itself is time-varying. No way in general that will be time-invariant,
+since the equation that defines its evolution is changing. You test
+time-invariance or time variance through the response map. But is it
+linear? You have the state transition function, so we can compute the
+response function (recall: readout map composed with the state transition
+function) and ask if this is a linear map.</p></div><div class='pos'></div>
<script src='mathjax/unpacked/MathJax.js?config=default'></script>
<script type="text/x-mathjax-config">
MathJax.Hub.Register.StartupHook("TeX Jax Ready",function () {
96 fa2012/cs150/10.md
@@ -0,0 +1,96 @@
+CS 150: Digital Design & Computer Architecture
+==============================================
+September 20, 2012
+------------------
+
+Non-overlapping clocks. n-phase means that you've got n different outputs,
+and at most one high at any time. Guaranteed dead time between when one
+goes low and next goes high.
+
+K-maps
+------
+Finding minimal sum-of-products and product-of-sums expressions for
+functions. **On-set**: all the ones of a function; **implicant**: one or
+more circled ones in the on-set; a **minterm** is the smallest implicant you
+can have, and implicant sizes go up by powers of two in the number of ones
+you circle; a **prime implicant** can't be combined with another (by circling);
+an **essential prime implicant** is a prime implicant that contains at
+least one one not in any other prime implicant. A **cover** is any
+collection of implicants that contains all of the ones in the on-set, and a
+**minimal cover** is one made up of essential prime implicants and the
+minimum number of implicants.
+
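+A tiny worked example (added for illustration, not from lecture): the
+3-input majority function $f(a,b,c)$ has on-set $\{3, 5, 6, 7\}$. With the
+columns in Gray-code order:
+
+| a \ bc | 00 | 01 | 11 | 10 |
+|--------|----|----|----|----|
+| 0      | 0  | 0  | 1  | 0  |
+| 1      | 0  | 1  | 1  | 1  |
+
+The prime implicants are $bc$ (cells 3, 7), $ac$ (cells 5, 7), and $ab$
+(cells 6, 7). Each is essential (cell 3 is covered only by $bc$, cell 5 only
+by $ac$, cell 6 only by $ab$), so the minimal cover is $f = ab + ac + bc$.
+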
+Hazards vs. glitches. Glitches are when timing issues result in dips (or
+spikes) in the output; hazards are if they might happen. Completely
+irrelevant in synchronous logic.
+
+Project
+-------
+3-stage pipeline MIPS150 processor. Serial port, graphics accelerator. If
+we look at the datapath elements, the storage elements, you've got your
+program counter, your instruction memory, register file, and data
+memory. Figure 7.1 from the book. If you mix that in with figure 8.28,
+which talks about MMIO, that data memory, there's an address and data bus
+that this is hooked up to, and if you want to talk to a serial port on a
+MIPS processor (or an ARM processor, or something like that), you don't
+address a particular port (not like x86). Most ports are
+memory-mapped. Actually got a MMIO module that is also hooked up to the
+address and data bus. For some range of addresses, it's the one that
+handles reads and writes.
+
+You've got a handful of different modules down here such as a UART receive
+module and a UART transmit module. In your project, you'll have your
+personal computer that has a serial port on it, and that will be hooked up
+to your project, which contains the MIPS150 processor. Somehow, you've got
+to be able to handle characters transmitted in each direction.
+
+UART
+----
+Common ground, TX on one side connected to RX port on other side, and vice
+versa. Whole bunch more in different connectors. Basic protocol is called
+RS232, common (people often refer to it by connector name: DB9, rarely
+DB25); fortunately, we've moved away from this world and use USB. We'll
+talk about these other protocols later, some sync, some async. Workhorse
+for long time, still all over the place.
+
+You're going to build the UART receiver/transmitter and MMIO module that
+interfaces them. See when something's coming in from software /
+hardware. Going to start out with polling; we will implement interrupts
+later on in the project (for timing and serial IO on the MIPS
+processor). That's really the hardcore place where software and hardware
+meet. People who understand how each interface works and how to use those
+optimally together are valuable and rare people.
+
+What you're doing in Lab 4, there are really two concepts: (1) how does
+serial / UART work and (2) ready / valid handshake.
+
+On the MIPS side, you've got some addresses. Anything that starts with FFFF
+is part of the memory-mapped region. In particular, the first four are
+mapped to the UART: they are RX control, RX data, TX control, and TX data.
+
+When you want to send something out the UART, you write the byte -- there's
+just one bit for the control and one byte for data.
+
+Data goes into some FSM system, and you've got an RX shift register and a
+TX shift register.
+
+There's one other piece of this, which is that inside of here, the thing
+interfacing to this IO-mapped module uses this ready bit. If you have two
+modules: a source and a sink (diagram from the document), the source has
+some data that it is sending out, tells the sink when the data is valid, and
+the sink tells the source when it is ready. And there's a shared "clock"
+(baud rate), and this is a synchronous interface.
+
+* source presents data
+* source raises valid
+* when ready & valid on posedge clock, both sides know the transaction was
+ successful.
+
+Whatever order this happens in, source is responsible for making sure data
+is valid.
+
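+A toy cycle-by-cycle model of that handshake (my sketch, in Python rather
+than Verilog; the names are invented): a byte transfers only on a cycle
+where both ready and valid are high at the clock edge.
+
+```python
+def simulate(data, sink_ready_pattern):
+    """Source sends `data`; the sink is ready only on some cycles."""
+    sent = []
+    i = 0  # index of the next byte the source wants to send
+    for cycle, ready in enumerate(sink_ready_pattern):
+        valid = i < len(data)    # source presents data and raises valid
+        if ready and valid:      # both sides see the transaction succeed
+            sent.append((cycle, data[i]))
+            i += 1
+    return sent
+
+print(simulate([0x41, 0x42, 0x43], [0, 1, 1, 0, 0, 1, 1]))
+# -> [(1, 65), (2, 66), (5, 67)]
+```
+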
+HDLC? Takes bytes and puts into packets, ACKs, etc.
+
+Talk about quartz crystals, resonators. $\pi \cdot 10^7$.
+
+So: before I let you go, parallel load, n bits in, serial out, etc.
5 fa2012/cs150/11.md
@@ -0,0 +1,5 @@
+CS 150: Digital Design & Computer Architecture
+==============================================
+September 25, 2012
+------------------
+
2  fa2012/cs150/3.md
@@ -1,5 +1,5 @@
CS 150: Digital Design & Computer Architecture
-===============================================
+==============================================
August 28, 2012
---------------
2  fa2012/cs150/4.md
@@ -1,5 +1,5 @@
CS 150: Digital Design & Computer Architecture
-===============================================
+==============================================
August 30, 2012
---------------
2  fa2012/cs150/5.md
@@ -1,5 +1,5 @@
CS 150: Digital Design & Computer Architecture
-===============================================
+==============================================
September 4, 2012
-----------------
2  fa2012/cs150/6.md
@@ -1,5 +1,5 @@
CS 150: Digital Design & Computer Architecture
-===============================================
+==============================================
September 6, 2012
-----------------
2  fa2012/cs150/7.md
@@ -1,5 +1,5 @@
CS 150: Digital Design & Computer Architecture
-===============================================
+==============================================
September 11, 2012
------------------
2  fa2012/cs150/8.md
@@ -1,5 +1,5 @@
CS 150: Digital Design & Computer Architecture
-===============================================
+==============================================
September 13, 2012
------------------
83 fa2012/cs150/9.md
@@ -0,0 +1,83 @@
+CS 150: Digital Design & Computer Architecture
+==============================================
+September 18, 2012
+------------------
+
+Lab this week you are learning about chipscope. Chipscope is kinda like
+what it sounds: allows you to monitor things happening in the FPGA. One of
+the interesting things about Chipscope is that it's an FSM monitoring stuff
+in your FPGA; it also gets compiled down, and it changes the location of
+everything that goes into your chip. It can actually make your bug go away
+(e.g. timing bugs).
+
+So. Counters. How do counters work? If I've got a 4-bit counter and I'm
+counting from 0, what's going on here?
+
+D-ff with an inverter and enable line? This is a T-ff (toggle
+flipflop). That'll get me my first bit, but my second bit is slower. $Q_1$
+wants to toggle only when $Q_0$ is 1. With subsequent bits, they want to
+toggle when all lower bits are 1.
+
+Counter with en: enable is tied to the toggle of the first bit. Counter
+with ld: four input bits, four output bits. Clock. Load. Then we're going
+to want to do a counter with ld, en, rst. Put in logic, etc.
+
+Quite common: ripple carry out (RCO), where we AND $Q[3:0]$ and feed this
+into the enable of $T_4$.
+
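+A little Python model of that "toggle when all lower bits are 1" rule (my
+sketch, not from lecture); all toggles are computed from the current state
+and then applied together, like flip-flops sharing one clock edge:
+
+```python
+def tick(q, en=True):
+    """One clock edge of a counter built from T flip-flops; q[0] is the LSB."""
+    # Bit i toggles when enable is high and all lower bits are 1.
+    # (The AND of all the bits is also what feeds the ripple carry out.)
+    toggles = [en and all(q[:i]) for i in range(len(q))]
+    return [bit ^ int(t) for bit, t in zip(q, toggles)]
+
+state = [0, 0, 0, 0]
+for _ in range(5):
+    state = tick(state)
+    print(state)   # counts 1, 2, 3, 4, 5 (LSB first)
+```
+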
+Ring counter (shift register with one-hot out). If reset is low, I just
+shift this thing around and make a circular shift register. If high, I clear
+the out bit.
+
+Mobius counter: just a ring counter with a feedback inverter in it. Just
+going to take whatever state is in there, and after n clock ticks, it inverts
+itself. So you have $n$ flipflops, and you get $2n$ states.
+
+And then you've got LFSRs (linear feedback shift registers). Given N
+flipflops, we know that a straight up or down counter will give us $2^N$
+states. Turns out that an LFSR gives you almost that (not 0). So why do
+that instead of an up-counter? This can give you a PRNG. Fun times with
+Galois fields.
+
+Various uses, seeds, high enough periods (Mersenne twisters are higher).
+
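+A quick sketch of a 4-bit Fibonacci LFSR (mine, not from lecture), with the
+feedback taken from the two most significant bits; it walks through all
+$2^4 - 1 = 15$ nonzero states before repeating:
+
+```python
+def lfsr_states(seed=0b0001):
+    """Yield successive states of a maximal-length 4-bit LFSR."""
+    state = seed
+    while True:
+        yield state
+        bit = ((state >> 3) ^ (state >> 2)) & 1  # XOR of the two top bits
+        state = ((state << 1) | bit) & 0b1111    # shift left, feed the bit in
+
+gen = lfsr_states()
+seen = [next(gen) for _ in range(16)]
+print([format(s, "04b") for s in seen])
+print(len(set(seen)))   # 15: every nonzero state appears before it repeats
+```
+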
+RAM
+---
+Remember, decoder, cell array, $2^n$ rows, $2^n$ word lines, some number of
+bit lines coming out of that cell array for I/O with output-enable and
+write-enable.
+
+When output-enable is low, D goes to high-Z. At some point, some external
+device starts driving some Din (not from memory). Then I can apply a write
+pulse (write strobe), which causes our data to be written into the memory
+at this address location. Whatever was driving it releases, so it goes back
+to high-impedance, and if we turn output-enable on again, we'll see "Din" from
+the cell array.
+
+During the write pulse, we need Din stable and address stable. We have a
+pulse because we don't want to break things. Bad things happen.
+
+Notice: no clock anywhere. Your FPGA (in particular, the block ram on the
+ML505) is a little different in that it has registered input (addr &
+data). First off, very configurable. All sorts of ways you can set this up,
+etc. Addr in particular goes into a register, and from there into a decoder
+before it reaches the cell array. What comes out of that cell array is a
+little bit different also: there's a data-in line that goes into a register,
+and a separate data-out that can be configured in a whole bunch of different
+ways so that you can do a bunch of different things.
+
+The important thing is that you can apply your address to those inputs, and
+it doesn't show up until the rising edge of the clock. There's the option
+of having either registered or non-registered output (non-registered for
+this lab).
+
+So now we've got an ALU and RAM. And so we can build some simple
+datapaths. For sure you're going to see on the final (and most likely the
+midterm) problems like "given a 16-bit ALU and a 1024x16 sync SRAM, design
+a system to find the largest unsigned int in the SRAM."
+
+Demonstration of clock cycles, etc. So what's our FSM look like? Either
+LOAD or HOLD.
+
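+A rough cycle-level model of that problem (my sketch; the register names are
+invented): the sync SRAM registers the address, so read data shows up one
+cycle after the address is presented, and the max register either LOADs or
+HOLDs each cycle:
+
+```python
+def find_max(mem):
+    """Scan a synchronous SRAM (1-cycle read latency) for the largest value."""
+    n = len(mem)
+    addr_reg = None   # address captured by the SRAM on the previous edge
+    max_reg = 0       # running maximum (unsigned)
+    for cycle in range(n + 1):
+        # Read data corresponds to *last* cycle's address.
+        rdata = mem[addr_reg] if addr_reg is not None else None
+        if rdata is not None and rdata > max_reg:
+            max_reg = rdata                       # LOAD, else HOLD
+        addr_reg = cycle if cycle < n else None   # present the next address
+    return max_reg
+
+print(find_max([7, 1, 42, 9, 13]))   # 42
+```
+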
+On homework, did not say sync SRAM. Will probably change.
205 fa2012/cs150/cs150.md
@@ -197,7 +197,7 @@ stuff.
<a name='3'></a>
CS 150: Digital Design & Computer Architecture
-===============================================
+==============================================
August 28, 2012
---------------
@@ -297,7 +297,7 @@ and a maxterm is a sum containing every input variable or its complement.
<a name='4'></a>
CS 150: Digital Design & Computer Architecture
-===============================================
+==============================================
August 30, 2012
---------------
@@ -355,7 +355,7 @@ de Morgan's law: "bubble-pushing".
<a name='5'></a>
CS 150: Digital Design & Computer Architecture
-===============================================
+==============================================
September 4, 2012
-----------------
@@ -467,7 +467,7 @@ can make FSMs.
<a name='6'></a>
CS 150: Digital Design & Computer Architecture
-===============================================
+==============================================
September 6, 2012
-----------------
@@ -539,7 +539,7 @@ Next time: more MIPS, memory.
<a name='7'></a>
CS 150: Digital Design & Computer Architecture
-===============================================
+==============================================
September 11, 2012
------------------
@@ -680,7 +680,7 @@ means that this might come back into fashion.
<a name='8'></a>
CS 150: Digital Design & Computer Architecture
-===============================================
+==============================================
September 13, 2012
------------------
@@ -786,3 +786,196 @@ sums. (Section 2.7). Based on the combining theorem, which says that $XA +
X\bar{A} = X$. Ideally: every row should just have a single value
changing. So, I use Gray codes. (e.g. 00, 01, 11, 10). Graphical
representation!
+
+<a name='9'></a>
+
+CS 150: Digital Design & Computer Architecture
+==============================================
+September 18, 2012
+------------------
+
+Lab this week you are learning about chipscope. Chipscope is kinda like
+what it sounds: allows you to monitor things happening in the FPGA. One of
+the interesting things about Chipscope is that it's an FSM monitoring stuff
+in your FPGA; it also gets compiled down, and it changes the location of
+everything that goes into your chip. It can actually make your bug go away
+(e.g. timing bugs).
+
+So. Counters. How do counters work? If I've got a 4-bit counter and I'm
+counting from 0, what's going on here?
+
+D-ff with an inverter and enable line? This is a T-ff (toggle
+flipflop). That'll get me my first bit, but my second bit is slower. $Q_1$
+wants to toggle only when $Q_0$ is 1. With subsequent bits, they want to
+toggle when all lower bits are 1.
+
+Counter with en: enable is tied to the toggle of the first bit. Counter
+with ld: four input bits, four output bits. Clock. Load. Then we're going
+to want to do a counter with ld, en, rst. Put in logic, etc.
+
+Quite common: ripple carry out (RCO), where we AND $Q[3:0]$ and feed this
+into the enable of $T_4$.
+
+Ring counter (shift register with one-hot out). If reset is low, I just
+shift this thing around and make a circular shift register. If high, I clear
+the out bit.
+
+Mobius counter: just a ring counter with a feedback inverter in it. Just
+going to take whatever state is in there, and after n clock ticks, it inverts
+itself. So you have $n$ flipflops, and you get $2n$ states.
+
+And then you've got LFSRs (linear feedback shift registers). Given N
+flipflops, we know that a straight up or down counter will give us $2^N$
+states. Turns out that an LFSR gives you almost that (not 0). So why do
+that instead of an up-counter? This can give you a PRNG. Fun times with
+Galois fields.
+
+Various uses, seeds, high enough periods (Mersenne twisters are higher).
+
+RAM
+---
+Remember, decoder, cell array, $2^n$ rows, $2^n$ word lines, some number of
+bit lines coming out of that cell array for I/O with output-enable and
+write-enable.
+
+When output-enable is low, D goes to high-Z. At some point, some external
+device starts driving some Din (not from memory). Then I can apply a write
+pulse (write strobe), which causes our data to be written into the memory
+at this address location. Whatever was driving it releases, so it goes back
+to high-impedance, and if we turn output-enable on again, we'll see "Din" from
+the cell array.
+
+During the write pulse, we need Din stable and address stable. We have a
+pulse because we don't want to break things. Bad things happen.
+
+Notice: no clock anywhere. Your FPGA (in particular, the block ram on the
+ML505) is a little different in that it has registered input (addr &
+data). First off, very configurable. All sorts of ways you can set this up,
+etc. Addr in particular goes into a register, and from there into a decoder
+before it reaches the cell array. What comes out of that cell array is a
+little bit different also: there's a data-in line that goes into a register,
+and a separate data-out that can be configured in a whole bunch of different
+ways so that you can do a bunch of different things.
+
+The important thing is that you can apply your address to those inputs, and
+it doesn't show up until the rising edge of the clock. There's the option
+of having either registered or non-registered output (non-registered for
+this lab).
+
+So now we've got an ALU and RAM. And so we can build some simple
+datapaths. For sure you're going to see on the final (and most likely the
+midterm) problems like "given a 16-bit ALU and a 1024x16 sync SRAM, design
+a system to find the largest unsigned int in the SRAM."
+
+Demonstration of clock cycles, etc. So what's our FSM look like? Either
+LOAD or HOLD.
+
+On homework, did not say sync SRAM. Will probably change.
+
+<a name='10'></a>
+
+CS 150: Digital Design & Computer Architecture
+==============================================
+September 20, 2012
+------------------
+
+Non-overlapping clocks. n-phase means that you've got n different outputs,
+and at most one high at any time. Guaranteed dead time between when one
+goes low and next goes high.
+
+K-maps
+------
+Finding minimal sum-of-products and product-of-sums expressions for
+functions. **On-set**: all the ones of a function; **implicant**: one or
+more circled ones in the on-set; a **minterm** is the smallest implicant you
+can have, and implicant sizes go up by powers of two in the number of ones
+you circle; a **prime implicant** can't be combined with another (by circling);
+an **essential prime implicant** is a prime implicant that contains at
+least one one not in any other prime implicant. A **cover** is any
+collection of implicants that contains all of the ones in the on-set, and a
+**minimal cover** is one made up of essential prime implicants and the
+minimum number of implicants.
+
+Hazards vs. glitches. Glitches are when timing issues result in dips (or
+spikes) in the output; hazards are if they might happen. Completely
+irrelevant in synchronous logic.
+
+Project
+-------
+3-stage pipeline MIPS150 processor. Serial port, graphics accelerator. If
+we look at the datapath elements, the storage elements, you've got your
+program counter, your instruction memory, register file, and data
+memory. Figure 7.1 from the book. If you mix that in with figure 8.28,
+which talks about MMIO, that data memory, there's an address and data bus
+that this is hooked up to, and if you want to talk to a serial port on a
+MIPS processor (or an ARM processor, or something like that), you don't
+address a particular port (not like x86). Most ports are
+memory-mapped. Actually got a MMIO module that is also hooked up to the
+address and data bus. For some range of addresses, it's the one that
+handles reads and writes.
+
+You've got a handful of different modules down here such as a UART receive
+module and a UART transmit module. In your project, you'll have your
+personal computer that has a serial port on it, and that will be hooked up
+to your project, which contains the MIPS150 processor. Somehow, you've got
+to be able to handle characters transmitted in each direction.
+
+UART
+----
+Common ground, TX on one side connected to RX port on other side, and vice
+versa. Whole bunch more in different connectors. Basic protocol is called
+RS232, common (people often refer to it by connector name: DB9, rarely
+DB25); fortunately, we've moved away from this world and use USB. We'll
+talk about these other protocols later, some sync, some async. Workhorse
+for long time, still all over the place.
+
+You're going to build the UART receiver/transmitter and MMIO module that
+interfaces them. See when something's coming in from software /
+hardware. Going to start out with polling; we will implement interrupts
+later on in the project (for timing and serial IO on the MIPS
+processor). That's really the hardcore place where software and hardware
+meet. People who understand how each interface works and how to use those
+optimally together are valuable and rare people.
+
+What you're doing in Lab 4, there are really two concepts: (1) how does
+serial / UART work and (2) ready / valid handshake.
+
+On the MIPS side, you've got some addresses. Anything that starts with FFFF
+is part of the memory-mapped region. In particular, the first four are
+mapped to the UART: they are RX control, RX data, TX control, and TX data.
+
+When you want to send something out the UART, you write the byte -- there's
+just one bit for the control and one byte for data.
+
+Data goes into some FSM system, and you've got an RX shift register and a
+TX shift register.
+
+There's one other piece of this, which is that inside of here, the thing
+interfacing to this IO-mapped module uses this ready bit. If you have two
+modules: a source and a sink (diagram from the document), the source has
+some data that it is sending out, tells the sink when the data is valid, and
+the sink tells the source when it is ready. And there's a shared "clock"
+(baud rate), and this is a synchronous interface.
+
+* source presents data
+* source raises valid
+* when ready & valid on posedge clock, both sides know the transaction was
+ successful.
+
+Whatever order this happens in, source is responsible for making sure data
+is valid.
+
+HDLC? Takes bytes and puts into packets, ACKs, etc.
+
+Talk about quartz crystals, resonators. $\pi \cdot 10^7$.
+
+So: before I let you go, parallel load, n bits in, serial out, etc.
+
+<a name='11'></a>
+
+CS 150: Digital Design & Computer Architecture
+==============================================
+September 25, 2012
+------------------
+
105 fa2012/cs_h195/3.md
@@ -2,5 +2,108 @@ CS H195: Ethics with Harvey
===========================
September 17, 2012
------------------
-
Lawsuit to get records about NSA's surveillance information.
+
+Video games affecting people, evidently.
+
+Government subpoenaed Twitter to hand over people's tweets.
+
+Records can be subpoenaed in a court case, etc. We'll see how this plays
+out. In today's Daily Cal: UCB suing big companies. Universities do
+research, etc. Back in the day, core memory meant people paid money to IBM
+and MIT. Berkeley holds a bunch of patents. Non-software seems reasonable.
+
+Important point: the burst of genius is very rarely true. Enabling
+technologies have reached the point of making things feasible. Usual story
+about inventions. Flash bulb in a camera, single-use: before sustainable
+light bulb. Steam engine. Some inventions aren't like that. Some really do
+just come to somebody (velcro, xerography). Nobody else was working on
+that. More often, everyone is thinking about this stuff.
+
+IP. A patent is the right to develop an invention, to produce things
+dependent on an invention. Copyright is not about invention, it's about
+creative and artistic works. And there, if you have an idea and write about
+it, other people are allowed to use your ideas, not your words. Trademark,
+you know what it is; you can register one; people are not allowed to use it
+in ways that might confuse people. You can in principle make a vacuum
+cleaner called "time". How close do things have to be to raise a lawsuit?
+Lawsuit about Apple Computers vs Apple Records. Apple Computer later did get
+into music, which caused a later round of battling.
+
+Personal likeness, I can't take a picture of you and publish it with
+certain exceptions. Most important for famous people. Funny rules:
+newsworthy, and news photographers are allowed to take pictures of
+newsworthy people.
+
+Trade secrets: if a company has secrets, and you are a competing company,
+you may not send a spy to extract these secrets.
+
+House ownership. There are houses that families have owned for
+millennia. Patents and copyrights are not like that: not a right. Those
+things are bargains between creators and society. Purpose to society is
+that these eventually belong to the public. One of the readings talks about
+a different history of patents quoting Italian legal scholars, and if
+correct, patents were supposed to be permanent ownership. Why might it be
+good to society? Used to be people who made new inventions. Guilds. Hard to
+join, and you would be a slave for a while. Master would teach apprentice
+the trade, and advantage was that it reduced competition. Trouble was that
+there is a long history of things people used to be able to do that we
+can't anymore. Textbook example: Stradivarius violins.
+
+Nonetheless, nobody knows how Stradivarius made violins. Stories about how
+to make paints of particular colors. What the patent system is trying to
+avoid. Describe how invention works so someone in the field can create
+it. By making this disclosure, you are given a limited-term exclusive right
+to make these.
+
+The thing is, sooner or later, your technology is going to be obsolete. To
+your advantage to have a clear legal statement.
+
+Patent treaties. Used to be that if you invented something important, you'd
+hire a bunch of lawyers.
+
+Until recently, software was not patentable. ATT wanted to patent the
+SETUID bit. In those days, you could not patent any math or software or
+algorithm.
+
+Patents stifling innovation in the field. When you file a patent
+application. Let's say they deny the patent. You would like to fall back on
+trade secrecy. Patent applications are secret until approved. Startups
+doomed. Wouldn't matter if term were short compared to innovation cycle of
+the industry.
+
+Another thing in the Constitution is that treaties take precedence over
+domestic laws.
+
+So let's talk about copyrights! So. Nobody says let's do away with
+copyright altogether. Copyright (at its worst) is less socially harmful
+than patents because it's so specific. Again, copyrights are a
+bargain. Started in Britain between the King and printers. Printers wanted
+exclusive right to things they printed. King wanted printers to be
+censors. Originally not authors who had copyright, but the publisher. Often
+creators of rights will sell the rights to publishers.
+
+This is where computers come in. How to sell to world? Used to need big
+company with facilities to create copies and widely
+distribute. Self-publish: work available to everyone. Important: rarely
+author who complains about copyrights. Usually publishers.
+
+There's always been piracy, but limited historically by analog media losing
+information when copying.
+
+Term of copyright has gotten longer and longer. Lawsuit about this about
+the most recent extension. In effect, making permanent copyright, against
+constitution. Ironic because under current copyright law, much of
+what made Disney rich would still have been copyrighted. Lot of exceptions to
+copyright law. Fair use. e.g. cannot write a Harry Potter novel, but can
+write a Harry Potter parody. Famous case: Gone with the Wind. About how
+wonderful life was for the owners of slaves. Someone wrote a book
+(retelling from slave's point of view); ruled as fair use (political
+commentary, protected by free speech).
+
+Stallman actually invented a system that has 5 different categories of
+work. Even Stallman doesn't say to ditch copyright. Hardly any musicians
+make any money selling music because their contracts say that they make a
+certain percentage of net proceeds. The way musicians survive is concerts,
+and ironically, selling concert CDs. Stallman says to make music players
+have a money button and send money directly to the musician.
292 fa2012/cs_h195/4.md
@@ -0,0 +1,292 @@
+CS H195: Ethics with Harvey
+===========================
+September 24, 2012
+------------------
+Vastly oversimplified picture of moral philosophy. Leaves out a lot.
+
+So Socrates says famously "to know the good is to desire the good", by
+which he means that if you really understand what's in your own interest,
+it's going to turn out to be the right thing. Counter-intuitive, since
+we've probably encountered situations in which we think what's good for us
+isn't good for the rest of the community.
+
+Ended up convicting Socrates, and he was offered the choice between exile
+from Athens and death -- chose death because he felt that he could not
+exist outside of his own community. His most famous student was Plato, who
+started an Academy (Socrates just wandered around from hand to mouth), took
+in students (one of whom was Aristotle). If you're scientists or engineers,
+you've been taught to make fun of Aristotle, since he said that heavier
+objects fall faster than light objects, and famously, Galileo took two
+objects, dropped them, and they hit the ground at the same time.
+
+It's true that some of the things Aristotle said about the physical world
+have turned out not to be right. But it's important to understand that, in
+terms of the physical world, he did not have the modern idea of trying to
+make a universal theory that explained everything.
+
+Objects falling in atmosphere with friction different from behavior of
+planets orbiting sun? Perfectly fine with Aristotle.
+
+One of the things Aristotle knew? When you see a plate of donuts, you know
+perfectly well that it's just carbs and fat and you shouldn't eat them, but
+you do anyway. Socrates explains that as "you don't really know through and
+through that it is bad for you", and Aristotle doesn't like that
+explanation. Knowing what to do and actually doing it are two different
+things. Took that in two directions: action syllogism (transitivity),
+extended so that conclusion of the syllogism can be an action. Not
+important to us: important to us is that he introduces the idea of
+virtues. A virtue is not an understanding of what's right, but a habit --
+like a good habit you get into.
+
+Aristotle lists a bunch of virtues, and in each case he describes it as a
+midpoint between two extremes (e.g. courage between cowardice and
+foolhardiness, or honesty as a middle ground between dishonesty and saying
+too much).
+
+Better have good habits, since you don't have time in real crises to
+think. So Aristotle's big on habits. And he says that you learn the virtues
+through being a member of a community and through the role you play in that
+community. He lived in a time when people inherited roles a lot. The argument
+goes a little like this. What does it mean to be a good person? Hard
+question. What does it mean to be a good carpenter? Much easier. A good
+carpenter builds stuff that holds together and looks nice, etc. What are
+the virtues that lead to being a good carpenter? Also easy: patience, care,
+measurement, honesty, etc. Much easier than what's a good
+person.
+
+Aristotle's going to say that the virtues of being a good person are
+precisely the virtues you learn in social practices from people older than
+you who are masters of the practice. One remnant of that in modern society
+is martial arts instruction. When you go to a martial arts school and say
+you want to learn, one of the first things you learn is respect for your
+instructor, and you're supposed to live your life in a disciplined way, and
+you're not learning skills so much as habits. Like what Aristotle'd say
+about any practice. Not so much of that today: when you're learning to be a
+computer scientist, there isn't a lot of instruction in "here are the
+habits that make you a (morally) good computer scientist".
+
+Kant was not a communitarian: was more of "we can figure out the right
+answer to ethical dilemmas." He has an axiom system, just like in
+mathematics: with small numbers of axioms, you can prove things. Claims
+just one axiom, which he describes in multiple ways.
+
+Categorical imperative number one: treat people as ends, not means. This is
+the grown-up version of the golden rule. Contracts are all right as long as
+both parties have their needs met and exchange is not too unequal.
+
+Second version: universalizability. An action is good if it is
+universalizable. That means, if everybody did it, would it work? Textbook
+example is "you shouldn't tell lies". The only reason telling lies works is
+because people usually tell the truth, and so people are predisposed to
+thinking that it's usually true. If everyone told lies, then we'd be
+predisposed to disbelieve statements. Lying would no longer be effective.
+
+There's a third one which BH can never remember which is much less
+important. Kant goes on to prove theorems to resolve moral dilemmas.
+
+Problem from Kant: A runs past you into the house. B comes up with a gun
+and asks you where A is. Kant suggests something along the lines of
+misleading B.
+
+Axiomatic, resolve ethical problems through logic and proving what you want
+to do. Very popular among engineers, mainly for the work of Rawls, who
+talks about the veil of ignorance. You have to imagine yourself, looking at
+life on Earth, and not knowing in what social role you're going to be
+born. Rawls thinks that from this perspective, you have to root for the
+underdog when situations come up, because in any particular thing that
+comes up, harm to the rich person is going to be less than the gains of the
+poor person (in terms of total wealth, total needs). Going to worry about
+being on side of underdog, etc. More to Rawls: taking into account how
+things affect all different constituencies.
+
+Another descendant of Plato are utilitarians. One of the reasons it's
+important for you to understand this chart: when you don't think about it
+too hard, you use utilitarian principles, which is sometimes
+bad. Utilitarians talk about the greatest good for the greatest number.
+
+Back to something from this class: what if I illegally download some movie?
+Is that okay? How much do I benefit, and how much is the movie-maker
+harmed? Not from principled arguments, which is what Kant wants you to do,
+but from nuts and bolts, who benefits how much, each way.
+
+Putting that in a different fashion, Kantians are interested in what
+motivates your action, why you did it. Utilitarians are interested in the
+result of your action. One thing that makes utilitarian hard is that you
+have to guess as to what probably will happen.
+
+Now I want to talk to you about MacIntyre. Gave you a lot of reading,
+probably hardest reading in the course. Talks like a philosopher. Uses
+desert as what you deserve (the noun of deserve). Life-changing for BH when he
+came across MacIntyre; passing it on to you as a result.
+
+He starts by saying to imagine an aftermath in which science is blamed and
+destroyed. A thousand years later, some people digging through the remains
+of our culture read about this word science, and it's all about
+understanding how the physical world works, and they want to revive this
+practice. Dig up books by scientists, read and memorize bits of them,
+analyze, have discussions. The people who do this call themselves
+scientists because they're studying science.
+
+We from our perspective would say that isn't science at all -- you don't
+just engage with books, but rather engage with the physical world through
+experiments. Those imagined guys from a millennium from now have lost the
+practice. They think they're following a practice, but they have no idea
+what it's like. MacIntyre argues this is us with ethics.
+
+Equivalent to WW3 according to MacIntyre is Kant. Kant really, more than
+anyone else, brought into being the modern era. Why? Because in the times
+prior to Kant, a lot of arguments not only about ethics but also by the
+physical world were resolved by religious authority. Decisions made based
+on someone's interpretation of the bible, e.g.
+
+Kant claims to be a Christian, but he thinks the way we understand God's
+will is by applying the categorical imperative. Instead of asking a priest
+what to do, we reason it out. We don't ask authorities, we work it out.
+Also, he starts this business of ethical dilemmas. Everybody in the top
+half of the world talks in terms of the good life. Even Socrates, who
+thinks you can know what to do, talks about the good life, too. So ethics
+is not about "what do I do in this situation right now", but rather the
+entirety of one's life and what it means to live a good life.
+
+Kant and Mill: no sense of life as a flow; rather, moments of
+decisions. What MacIntyre calls the ethical equivalent of WW3: at that
+point, we lost the thread, since we stopped talking about the good
+life. Now, it wasn't an unmitigated disaster, since it gives us -- the
+modern liberal society, not in the American sense of voting for democrats,
+but in the sense that your life goals are up to you as an individual, and
+the role of society is to build infrastructure and getting in people's way,
+so stopping people from doing things. I can, say, have some sexual practice
+different from yours. So that was a long time coming. Now, in our
+particular culture, the only thing that's bad is having sex with children,
+as far as I can tell -- as long as it doesn't involve you messing up
+someone else's life, e.g. rape. As long as it involves two (or more?)
+consenting adults, that's okay.
+
+MacIntyre says that there are things that came up with Kant that we can't
+just turn back to being Aristotelian. The people who lived the good life
+were male Athenian citizens. They had wives who weren't eligible, and they
+had slaves who did most of the grunt work. And so male Athenian citizens
+could spend their time walking around chatting with Socrates because they
+were supported by slavery. And nobody wants to go back to that. No real way
+to go back to being Aristotelian without giving up modern civil rights.
+
+So. One of the things I really like about MacIntyre is the example of
+wanting to teach a child how to play chess, but he's not particularly
+interested. He is, however, interested in candy. You say, every time you
+play with me, I'll give you a piece of candy. If you win, two pieces. Will
+play in a way that's difficult but possible to beat me. So, MacIntyre says
+this child is now motivated to play and to play well. But he's also
+motivated to cheat, if he can get away with it. So let's say this
+arrangement goes on for some time, and the kid gets better at it. What you
+hope is that the child reaches a point where the game is valuable in
+itself: he or she sees playing chess as rewarding (as an intellectual
+challenge). When that happens, cheating becomes self-defeating.
+
+While the child is motivated by external goods (rewards, money, fame,
+whatever), then the child is not part of the community of practice. But
+once the game becomes important (the internal benefits motivate him), then
+he does feel like part of the community. Huge chess community with
+complicated infrastructure with rating, etc. And that's a community with
+practice, and it has virtues (some of which are unique to chess, but maybe
+not -- e.g. planning ahead). Honesty, of course; patience; personal
+improvement.
+
+And the same is true with most things that human beings do. Not
+everything. MacIntyre raises the example of advertising. What are the
+virtues of this practice? Well, appealing to people in ways that they don't
+really see; suggesting things that aren't quite true without saying
+them. He lists several virtues that advertising people have, and these
+virtues don't generalize. Not part of being a good person; not even
+compatible with being a good person. So different from virtues of normal
+practices.
+
+Having advertising writers is one of the ways in which MacIntyre thinks
+we've just lost the thread. The reason we have them is that we hold up in
+our society the value of furthering your own ambition and getting rich, and
+not getting rich by doing something that's good anyway, but just getting
+rich. That's an external motivation rather than an internal one.
+
+We talk about individuals pursuing their own ends. We glorify -- take as an
+integral part of our society -- individuals pursuing their own ends. In
+a modern understanding of ethics, you approach each new situation as if
+you've never done anything. You don't learn from experience; you learn from
+rules. The result may be the same for each intermediate situation, but it
+leads to you thinking differently. You don't think about building good
+habits in this context.
+
+A lot of you probably exercise (unlike me). Maybe you do it because it's
+fun, but maybe you also do it because it only gets harder as you get older,
+and you should get in the habit to keep it up. In that area, you get into
+habits. But writing computer programs, we tell you about rules (don't have
+concurrency violations), and I guess implicitly, we say that taking 61B is
+good for you because you learn to write bigger programs. Still true --
+still a practice with virtues.
+
+Two things: that sort of professional standard of work is a pretty narrow
+ethical issue. They don't teach you to worry about the privacy implications
+of third parties. Also, when people say they have an ethical dilemma, they
+think about it as a decision. A communitarian would reject all that ethical
+dilemma stuff. Dilemmas will have bad outcomes regardless. Consider Greek
+tragedies. When Oedipus finds himself married to his mother, it's like game
+over. Whole series of bad things that happen to him. Not much he can do
+about it on an incident by incident basis. Problem is a fatal flaw in his
+character early on (as well as some ignorance), and no system of ethics is
+going to lead Oedipus out of this trap. What you have to do is try not to get
+into traps, and you do that through prudence and honesty and whatnot.
+
+Classic dilemma: Heinz is a guy whose wife has a fatal disease that can be
+cured by an expensive drug, but Heinz is poor. So he goes to the druggist
+and says that he can't afford to pay for this drug, but his wife is going
+to die, so the druggist says no. So Heinz is considering breaking into the
+drugstore at night and stealing the drug so his wife can live. What should
+he do and why? According to the literature, there's no right answer. What
+matters is your reason.
+
+I'm going to get this wrong, but it's something like this. Stage one: your
+immediate needs are what matter. Yes, he should steal it, because it's his
+wife, or no, he shouldn't steal it, because he should go to prison. Stage
+two: something like worrying about consequences to individuals. Might hurt
+druggist or might hurt his wife. Stage three: something like "well, I have
+a closer relationship to my wife than the druggist; I care more about my
+wife, so I should steal it". Stage four: it's against the law, and I
+shouldn't break the law. Stage five: like stage three, generalized to
+larger community: how much will it hurt my wife not to get the drug? A
+lot. How much will it hurt the druggist if I steal it? Some money. Stage
+six, based not on laws of community, but rather on the standards of the
+community. Odd-numbered stages are about specific people. Even-numbered
+stages are about society and rules (punishment if I do it to it's the law
+to it's what people expect of me).
+
+Right now I'm talking about the literature of moral psychology: people go
+through these stages (different ways of thinking). Question posed is not
+"how do people behave", but rather "how should people behave".
+
+This is modern ethical reasoning. Take some situation that has no right
+answer, and split hairs about finding a right answer somehow.
+
+Talk about flying: checklist for novices. Instructors don't use this list:
+eventually, you get to where you're looking at the entire dashboard at
+once, and things that aren't right jump out at you.
+
+Another example: take a bunch of chess pieces, put them on the board, get
+someone to look at it for a minute, and take the pieces away, and ask the
+person to reconstruct the board position. Non-chess players are terrible
+(unsurprisingly); chess grandmasters can do it if it came out of a real
+game; if you put it randomly, they're just as bad as the rest of
+us. They're not looking at individual pieces; they're looking at the board
+holistically (clusters of pieces that interact with each other).
+
+Relevance to this about ethics: we don't always know why we do things. Very
+rare that we have the luxury to figure out either what categorical
+imperative tells us or utilitarian approach. Usually we just do something.
+
+BH with weaknesses. Would be stronger if his education was less about
+thinking things through and more about doing the right thing.
+
+Our moral training is full of "Shalt Not"s. Lot more in the Bible about
+what not to do than what to do or how to live the good life (that part of
+the Bible -- gets better). We also have these laws. Hardly ever say you
+have to do something (aside from paying taxes). Mostly say what you can't
+do. Never say how to live the good life. BH thinks that serves us ill. Have
+to make decisions. Often, what you do is different from what you say you
+should do.
400 fa2012/cs_h195/cs_h195.md
@@ -454,5 +454,403 @@ CS H195: Ethics with Harvey
===========================
September 17, 2012
------------------
-
Lawsuit to get records about NSA's surveillance information.
+
+Video games affecting people, evidently.
+
+Government subpoenaed Twitter to hand over people's tweets.
+
+Records can be subpoenaed in a court case, etc. We'll see how this plays
+out. In today's Daily Cal: UCB suing big companies. Universities do
+research, etc. Back in the day, core memory meant people paid money to IBM
+and MIT. Berkeley holds a bunch of patents. Non-software seems reasonable.
+
+Important point: the burst of genius is very rarely true. Enabling
+technologies have reached the point of making things feasible. Usual story
+about inventions. Flash bulb in a camera, single-use: before sustainable
+light bulb. Steam engine. Some inventions aren't like that. Some really do
+just come to somebody (velcro, xerography). Nobody else was working on
+that. More often, everyone is thinking about this stuff.
+
+IP. A patent is the right to develop an invention, to produce things
+dependent on an invention. Copyright is not about invention, it's about
+creative and artistic works. And there, if you have an idea and write about
+it, other people are allowed to use your ideas, not your words. Trademark,
+you know what it is; you can register one; people are not allowed to use it
+in ways that might confuse people. You can in principle make a vacuum
+cleaner called "time". How close do things have to be to raise a lawsuit?
+Lawsuit about Apple Computers vs Apple Records. Apple Computer later did get
+into music, which caused a later round of battling.
+
+Personal likeness, I can't take a picture of you and publish it with
+certain exceptions. Most important for famous people. Funny rules:
+newsworthy, and news photographers are allowed to take pictures of
+newsworthy people.
+
+Trade secrets: if a company has secrets, and you are a competing company,
+you may not send a spy to extract these secrets.
+
+House ownership. There are houses that families have owned for
+millennia. Patents and copyrights are not like that: not a right. Those
+things are bargains between creators and society. Purpose to society is
+that these eventually belong to the public. One of the readings talks about
+a different history of patents quoting Italian legal scholars, and if
+correct, patents were supposed to be permanent ownership. Why might it be
+good to society? Used to be people who made new inventions. Guilds. Hard to
+join, and you would be a slave for a while. Master would teach apprentice
+the trade, and advantage was that it reduced competition. Trouble was that
+there is a long history of things people used to be able to do that we
+can't anymore. Textbook example: Stradivarius violins.
+
+Nonetheless, nobody knows how Stradivarius made violins. Stories about how
+to make paints of particular colors. What the patent system is trying to
+avoid. Describe how invention works so someone in the field can create
+it. By making this disclosure, you are given a limited-term exclusive right
+to make these.
+
+The thing is, sooner or later, your technology is going to be obsolete. To
+your advantage to have a clear legal statement.
+
+Patent treaties. Used to be that if you invented something important, you'd
+hire a bunch of lawyers.
+
+Until recently, software was not patentable. ATT wanted to patent the
+SETUID bit. In those days, you could not patent any math or software or
+algorithm.
+
+Patents stifling innovation in the field. When you file a patent
+application. Let's say they deny the patent. You would like to fall back on
+trade secrecy. Patent applications are secret until approved. Startups
+doomed. Wouldn't matter if term were short compared to innovation cycle of
+the industry.
+
+Another thing in the Constitution is that treaties take precedence over
+domestic laws.
+
+So let's talk about copyrights! So. Nobody says let's do away with
+copyright altogether. Copyright (at its worst) is less socially harmful
+than patents because it's so specific. Again, copyrights are a
+bargain. Started in Britain between the King and printers. Printers wanted
+exclusive right to things they printed. King wanted printers to be
+censors. Originally not authors who had copyright, but the publisher. Often
+creators of rights will sell the rights to publishers.
+
+This is where computers come in. How to sell to world? Used to need big
+company with facilities to create copies and widely
+distribute. Self-publish: work available to everyone. Important: rarely
+author who complains about copyrights. Usually publishers.
+
+There's always been piracy, but limited historically by analog media losing
+information when copying.
+
+Term of copyright has gotten longer and longer. Lawsuit about this about
+the most recent extension. In effect, making permanent copyright, against
+constitution. Ironic because under current copyright law, much of
+what made Disney rich would still have been copyrighted. Lot of exceptions to
+copyright law. Fair use. e.g. cannot write a Harry Potter novel, but can
+write a Harry Potter parody. Famous case: Gone with the Wind. About how
+wonderful life was for the owners of slaves. Someone wrote a book
+(retelling from slave's point of view); ruled as fair use (political
+commentary, protected by free speech).
+
+Stallman actually invented a system that has 5 different categories of
+work. Even Stallman doesn't say to ditch copyright. Hardly any musicians
+make any money selling music because their contracts say that they make a
+certain percentage of net proceeds. The way musicians survive is concerts,
+and ironically, selling concert CDs. Stallman says to make music players
+have a money button and send money directly to the musician.
+
+<a name='4'></a>
+
+CS H195: Ethics with Harvey
+===========================
+September 24, 2012
+------------------
+Vastly oversimplified picture of moral philosophy. Leaves out a lot.
+
+So Socrates says famously "to know the good is to desire the good", by
+which he means that if you really understand what's in your own interest,
+it's going to turn out to be the right thing. Counter-intuitive, since
+we've probably encountered situations in which we think what's good for us
+isn't good for the rest of the community.
+
+Ended up convicting Socrates, and he was offered the choice between exile
+from Athens and death -- chose death because he felt that he could not
+exist outside of his own community. His most famous student was Plato, who
+started an Academy (Socrates just wandered around from hand to mouth), took
+in students (one of whom was Aristotle). If you're scientists or engineers,
+you've been taught to make fun of Aristotle, since he said that heavier
+objects fall faster than light objects, and famously, Galileo took two
+objects, dropped them, and they hit the ground at the same time.
+
+It's true that some of the things Aristotle said about the physical world
+have turned out not to be right. But it's important to understand that, in
+terms of the physical world, he did not have the modern idea of trying to
+make a universal theory that explains everything.
+
+Objects falling in atmosphere with friction different from behavior of
+planets orbiting sun? Perfectly fine with Aristotle.
+
+One of the things Aristotle knew? When you see a plate of donuts, you know
+perfectly well that it's just carbs and fat and you shouldn't eat them, but
+you do anyway. Socrates explains that as "you don't really know through and
+through that it is bad for you", and Aristotle doesn't like that
+explanation. Knowing what to do and actually doing it are two different
+things. Took that in two directions: action syllogism (transitivity),
+extended so that conclusion of the syllogism can be an action. Not
+important to us: important to us is that he introduces the idea of
+virtues. A virtue is not an understanding of what's right, but a habit --
+like a good habit you get into.
+
+Aristotle lists a bunch of virtues, and in all cases he describes it as a
+midpoint between two extremes (e.g. courage between cowardice and
+foolhardiness, or honesty as a middle ground between dishonesty and saying
+too much).
+
+Better have good habits, since you don't have time in real crises to
+think. So Aristotle's big on habits. And he says that you learn the virtues
+through being a member of a community and through the role you play in that
+community. He lived in a time when people inherited roles a lot. The argument
+goes a little like this. What does it mean to be a good person? Hard
+question. What does it mean to be a good carpenter? Much easier. A good
+carpenter builds stuff that holds together and looks nice, etc. What are
+the virtues that lead to being a good carpenter? Also easy: patience, care,
+measurement, honesty, etc. Much easier than what's a good
+person.
+
+Aristotle's going to say that the virtues of being a good person are
+precisely the virtues you learn in social practices from people older than
+you who are masters of the practice. One remnant of that in modern society
+is martial arts instruction. When you go to a martial arts school and say
+you want to learn, one of the first things you learn is respect for your
+instructor, and you're supposed to live your life in a disciplined way, and
+you're not learning skills so much as habits. Like what Aristotle'd say
+about any practice. Not so much of that today: when you're learning to be a
+computer scientist, there isn't a lot of instruction in "here are the
+habits that make you a (morally) good computer scientist".
+
+Kant was not a communitarian: was more of "we can figure out the right
+answer to ethical dilemmas." He has an axiom system, just like in
+mathematics: with small numbers of axioms, you can prove things. Claims
+just one axiom, which he describes in multiple ways.
+
+Categorical imperative number one: treat people as ends, not means. This is
+the grown-up version of the golden rule. Contracts are all right as long as
+both parties have their needs met and exchange is not too unequal.
+
+Second version: universalizability. An action is good if it is
+universalizable. That means, if everybody did it, would it work? Textbook
+example is "you shouldn't tell lies". The only reason telling lies works is
+because people usually tell the truth, and so people are predisposed to
+thinking that it's usually true. If everyone told lies, then we'd be
+predisposed to disbelieve statements. Lying would no longer be effective.
+
+There's a third one which BH can never remember which is much less
+important. Kant goes on to prove theorems to resolve moral dilemmas.
+
+Problem from Kant: A runs past you into the house. B comes up with a gun
+and asks you where A is. Kant suggests something along the lines of
+misleading B.
+
+Axiomatic, resolve ethical problems through logic and proving what you want
+to do. Very popular among engineers, mainly for the work of Rawls, who
+talks about the veil of ignorance. You have to imagine yourself, looking at
+life on Earth, and not knowing in what social role you're going to be
+born. Rawls thinks that from this perspective, you have to root for the
+underdog when situations come up, because in any particular thing that
+comes up, harm to the rich person is going to be less than the gains of the
+poor person (in terms of total wealth, total needs). Going to worry about
+being on side of underdog, etc. More to Rawls: taking into account how
+things affect all different constituencies.
+
+Another descendant of Plato is utilitarianism. One of the reasons it's
+important for you to understand this chart: when you don't think about it
+too hard, you use utilitarian principles, which is sometimes
+bad. Utilitarians talk about the greatest good for the greatest number.
+
+Back to something from this class: what if I illegally download some movie?
+Is that okay? How much do I benefit, and how much is the movie-maker
+harmed? Not from principled arguments, which is what Kant wants you to do,
+but from nuts and bolts, who benefits how much, each way.
+
+Putting that in a different fashion, Kantians are interested in what
+motivates your action, why you did it. Utilitarians are interested in the
+result of your action. One thing that makes utilitarianism hard is that you
+have to guess as to what probably will happen.
+
+Now I want to talk to you about MacIntyre. Gave you a lot of reading,
+probably the hardest reading in the course. Talks like a philosopher. Uses
+"desert" as what you deserve (the noun form of "deserve"). Life-changing for BH when he
+came across MacIntyre; passing it on to you as a result.
+
+He starts by asking us to imagine a catastrophe, in the aftermath of which
+science is blamed and destroyed. A thousand years later, some people
+digging through the remains
+of our culture read about this word science, and it's all about
+understanding how the physical world works, and they want to revive this
+practice. Dig up books by scientists, read and memorize bits of them,
+analyze, have discussions. The people who do this call themselves
+scientists because they're studying science.
+
+We from our perspective would say that isn't science at all -- you don't
+just engage with books, but rather engage with the physical world through
+experiments. Those imagined guys from a millennium from now have lost the
+practice. They think they're following a practice, but they have no idea
+what it's like. MacIntyre argues this is us with ethics.
+
+The equivalent of WW3, according to MacIntyre, is Kant. Kant really, more
+than anyone else, brought into being the modern era. Why? Because in the
+times prior to Kant, a lot of arguments, not only about ethics but also
+about the physical world, were resolved by religious authority -- decisions
+made based on someone's interpretation of the Bible, for example.
+
+Kant claims to be a Christian, but he thinks the way we understand God's
+will is by applying the categorical imperative. Instead of asking a priest
+what to do, we reason it out. We don't ask authorities, we work it out.
+Also, he starts this business of ethical dilemmas. Everybody in the top
+half of the world talks in terms of the good life. Even Socrates, who
+thinks you can know what to do, talks about the good life, too. So ethics
+is not about "what do I do in this situation right now", but rather the
+entirety of one's life and what it means to live a good life.
+
+Kant and Mill: no sense of life as a flow; rather, moments of
+decisions. What MacIntyre calls the ethical equivalent of WW3: at that
+point, we lost the thread, since we stopped talking about the good
+life. Now, it wasn't an unmitigated disaster, since it gives us -- the
+modern liberal society, not in the American sense of voting for democrats,
+but in the sense that your life goals are up to you as an individual, and
+the role of society is to build infrastructure and to keep people from
+getting in each other's way, i.e. stopping people from doing things that
+mess up other people's lives. I can, say, have some sexual practice
+different from yours. So that was a long time coming. Now, in our
+particular culture, the only thing that's bad is having sex with children,
+as far as I can tell -- as long as it doesn't involve you messing up
+someone else's life, e.g. rape. As long as it involves two (or more?)
+consenting adults, that's okay.
+
+MacIntyre says that there are things that came with Kant that mean we can't
+just turn back to being Aristotelian. The people who lived the good life
+were male Athenian citizens. They had wives who weren't eligible, and they
+had slaves who did most of the grunt work. And so male Athenian citizens
+could spend their time walking around chatting with Socrates because they
+were supported by slavery. And nobody wants to go back to that. No real way
+to go back to being Aristotelian without giving up modern civil rights.
+
+So. One of the things I really like about MacIntyre is the example of
+wanting to teach a child how to play chess, but he's not particularly
+interested. He is, however, interested in candy. You say, every time you
+play with me, I'll give you a piece of candy. If you win, two pieces. I
+will play in a way that makes it difficult but possible to beat me. So, MacIntyre says
+this child is now motivated to play and to play well. But he's also
+motivated to cheat, if he can get away with it. So let's say this
+arrangement goes on for some time, and the kid gets better at it. What you
+hope is that the child reaches a point where the game is valuable in
+itself: he or she sees playing chess as rewarding (as an intellectual
+challenge). When that happens, cheating becomes self-defeating.
+
+As long as the child is motivated by external goods (rewards, money, fame,
+whatever), the child is not part of the community of practice. But
+once the game becomes important (the internal benefits motivate him), then
+he does feel like part of the community. Huge chess community with
+complicated infrastructure with rating, etc. And that's a community with
+practice, and it has virtues (some of which are unique to chess, but maybe
+not -- e.g. planning ahead). Honesty, of course; patience; personal
+improvement.
+
+And the same is true with most things that human beings do. Not
+everything. MacIntyre raises the example of advertising. What are the
+virtues of this practice? Well, appealing to people in ways that they don't
+really see; suggesting things that aren't quite true without saying
+them. He lists several virtues that advertising people have, and these
+virtues don't generalize. Not part of being a good person; not even
+compatible with being a good person. So different from virtues of normal
+practices.
+
+Having advertising writers is one of the ways in which MacIntyre thinks
+we've just lost the thread. The reason we have them is that we hold up in
+our society the value of furthering your own ambition and getting rich, and
+not getting rich by doing something that's good anyway, but just getting
+rich. That's an external motivation rather than an internal one.
+
+We talk about individuals pursuing their own ends. We glorify -- take as an
+integral part of our society -- individuals pursuing their own ends. In
+a modern understanding of ethics, you approach each new situation as if
+you've never done anything. You don't learn from experience; you learn from
+rules. The result may be the same for each intermediate situation, but it
+leads to you thinking differently. You don't think about building good
+habits in this context.
+
+A lot of you probably exercise (unlike me). Maybe you do it because it's
+fun, but maybe you also do it because it only gets harder as you get older,
+and you should get in the habit to keep it up. In that area, you get into
+habits. But writing computer programs, we tell you about rules (don't have
+concurrency violations), and I guess implicitly, we say that taking 61B is
+good for you because you learn to write bigger programs. Still true --
+still a practice with virtues.
+
+Two things: that sort of professional standard of work is a pretty narrow
+ethical issue. They don't teach you to worry about the privacy implications
+of third parties. Also, when people say they have an ethical dilemma, they
+think about it as a decision. A communitarian would reject all that ethical
+dilemma stuff. Dilemmas will have bad outcomes regardless. Consider Greek
+tragedies. When Oedipus finds himself married to his mother, it's like game
+over. Whole series of bad things that happen to him. Not much he can do
+about it on an incident by incident basis. Problem is a fatal flaw in his
+character early on (as well as some ignorance), and no system of ethics is
+going to lead Oedipus out of this trap. What you have to do is try not to get
+into traps, and you do that through prudence and honesty and whatnot.
+
+Classic dilemma: Heinz is a guy whose wife has a fatal disease that can be
+cured by an expensive drug, but Heinz is poor. So he goes to the druggist
+and says that he can't afford to pay for this drug, but his wife is going
+to die without it; the druggist says no. So Heinz is considering breaking
+into the drugstore at night and stealing the drug so his wife can live. What should
+he do and why? According to the literature, there's no right answer. What
+matters is your reason.
+
+I'm going to get this wrong, but it's something like this. Stage one: your
+immediate needs are what matter. Yes, he should steal it, because it's his
+wife, or no, he shouldn't steal it, because he should go to prison. Stage
+two: something like worrying about consequences to individuals. Might hurt
+druggist or might hurt his wife. Stage three: something like "well, I have
+a closer relationship to my wife than the druggist; I care more about my
+wife, so I should steal it". Stage four: it's against the law, and I
+shouldn't break the law. Stage five: like stage three, generalized to
+larger community: how much will it hurt my wife not to get the drug? A
+lot. How much will it hurt the druggist if I steal it? Some money. Stage
+six, based not on laws of community, but rather on the standards of the
+community. Odd-numbered stages are about specific people. Even-numbered
+stages are about society and rules (from "punishment if I do it" to "it's
+the law" to "it's what people expect of me").
+
+Right now I'm talking about the literature of moral psychology: people go
+through these stages (different ways of thinking). Question posed is not
+"how do people behave", but rather "how should people behave".
+
+This is modern ethical reasoning. Take some situation that has no right
+answer, and split hairs about finding a right answer somehow.
+
+Talk about flying: checklist for novices. Instructors don't use this list:
+eventually, you get to where you're looking at the entire dashboard at
+once, and things that aren't right jump out at you.
+
+Another example: take a bunch of chess pieces, put them on the board, get
+someone to look at it for a minute, and take the pieces away, and ask the
+person to reconstruct the board position. Non-chess players are terrible
+(unsurprisingly); chess grandmasters can do it if it came out of a real
+game; if you put it randomly, they're just as bad as the rest of
+us. They're not looking at individual pieces; they're looking at the board
+holistically (clusters of pieces that interact with each other).
+
+Relevance to this about ethics: we don't always know why we do things. Very
+rare that we have the luxury to figure out either what categorical
+imperative tells us or utilitarian approach. Usually we just do something.
+
+BH has weaknesses. He would be stronger if his education had been less
+about thinking things through and more about doing the right thing.
+
+Our moral training is full of "Shalt Not"s. Lot more in the Bible about
+what not to do than what to do or how to live the good life (that part of
+the Bible -- gets better). We also have these laws. Hardly ever say you
+have to do something (aside from paying taxes). Mostly say what you can't
+do. Never say how to live the good life. BH thinks that serves us ill. Have
+to make decisions. Often, what you do is different from what you say you
+should do.
124 fa2012/ee221a/10.md
@@ -0,0 +1,124 @@
+EE 221A: Linear System Theory
+=============================
+September 25, 2012
+------------------
+Linear time-varying systems
+---------------------------
+Recall that the state transition function gives the state at the current
+time as a function of the initial time, initial state, and input. Suppose
+you have a differential equation; how do you acquire the state transition
+function? Solve the differential equation.
+
+For a general dynamical system, there are different ways to get the state
+transition function. This is an instantiation of a dynamical system, and
+we're going to get the state transition function by solving the
+differential equation / initial condition pair.
+
+We're going to call $\dot{x}(t) = A(t)x(t) + B(t)u(t)$ a vector
+differential equation with initial condition $x(t_0) = x_0$.
+
+So that requires us to think about solving that differential equation. Do a
+dimension check, to make sure we know the dimensions of the matrices. $x
+\in \Re^n$, so $A \in \Re^{n \times n}$. We could define the matrix
+function $A$, which takes intervals of the real line and maps them over to
+matrices. As a function, $A$ is piecewise continuous matrix function in
+time.
+
+The entries are piecewise-continuous scalars in time. We would like to get
+at the state transition function; to do that, we need to solve the
+differential equation.
+
+Let's assume for now that $A, B, U$ are given (part of the system
+definition).
+
+Piecewise continuity is trivial; we can use the induced norm of $A(t)$ as a
+candidate Lipschitz function. Since this induced norm is
+piecewise-continuous in time, this is a fine bound. Therefore $f$ is
+globally Lipschitz continuous.
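+
+Spelled out (a minimal check of that claim, using the $f$ defined by the
+vector differential equation above): for $f(x, t) = A(t)x + B(t)u(t)$,
+
+$$\|f(x, t) - f(y, t)\| = \|A(t)(x - y)\| \le \|A(t)\| \, \|x - y\|,$$
+
+so the induced norm $\|A(t)\|$ itself serves as the (piecewise-continuous
+in $t$) Lipschitz function.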
+
+We're going to back off for a bit and introduce the state transition
+matrix. Background for solving the VDE. We're going to introduce a matrix
+differential equation, $\dot{X} = A(t) X$ (where $A(t)$ is same as before).
+
+I'm going to define $\Phi(t, t_0)$ as the solution to the matrix
+differential equation (MDE) for the initial condition $\Phi(t_0, t_0) =
+1_{n \times n}$. That is, $\Phi$ is the solution of the $n \times n$ matrix
+differential equation when it starts out at the identity matrix.
+
+Let's first talk about properties of this matrix $\Phi$ just from the
+definition we have.
+
+ 1. If you go back to the vector differential equation, and let's just drop
+    the term that depends on $u$ (either consider $B$ to be 0, or the input
+    to be 0), the solution of $\dot{x} = A(t)x(t)$ is given by $x(t) =
+    \Phi(t, t_0)x_0$.
+ 2. This is what we call the semigroup property, since it's reminiscent of
+    the semigroup axiom: $\Phi(t, t_0) = \Phi(t, t_1) \Phi(t_1, t_0)\;
+    \forall t, t_0, t_1 \in \Re^+$.
+ 3. $\Phi^{-1}(t, t_0) = \Phi(t_0, t)$.
+ 4. $\text{det} \Phi(t, t_0) = \exp\parens{\int_{t_0}^t \text{tr}\parens{A(\tau)} d\tau}$.
+
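+A quick sanity check (my own scalar example, not from the lecture): for
+$\dot{x} = a(t)x$ with $n = 1$, the state transition matrix is
+
+$$\Phi(t, t_0) = \exp\left(\int_{t_0}^t a(\tau)\, d\tau\right),$$
+
+and properties (2)-(4) can be verified directly: splitting the integral at
+$t_1$ gives the semigroup property, swapping the limits of integration
+gives the inverse, and (4) is immediate because the determinant of a $1
+\times 1$ matrix is its single entry.
+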
+Here let's talk about some machinery we can now invoke when
+we want to show that two functions of time are equal to each other when
+they're both solutions to the differential equation. You can simply show by
+the existence and uniqueness theorem (assuming it applies) that they
+satisfy the same initial condition and the same differential
+equation. That's an important point, and we tend to use it a lot.
+
+(i.e. when faced with showing that two functions of time are equal to each
+other, you can show that they both satisfy the same initial condition and
+the same differential equation [as long as the differential equation
+satisfies the hypotheses of the existence and uniqueness theorem])
+
+Obvious, but good to state.
+
+Note: the initial condition doesn't have to be the initial condition given;
+it just has to hold at one point in the interval. Pick your point in time
+judiciously.
+
+Proof of (2): check $t=t_1$. (3) follows directly from (2). (4) you can
+look at if you want. Gives you a way to compute $\Phi(t, t_0)$. We've
+introduced a matrix differential equation and an abstract solution.
+
+Consider (1). $\Phi(t, t_0)$ is a map that takes the initial state and
+transitions to the new state. Thus we call $\Phi$ the **state transition
+matrix** because of what it does to the states of this vector differential
+equation: it transfers them from their initial value to their final value,
+and it transfers them through matrix multiplication.
+
+Let's go back to the original differential equation. Claim that the
+solution to that differential equation has the following form: $x(t) =
+\Phi(t, t_0)x_0 + \int_{t_0}^t \Phi(t, \tau)B(\tau)u(\tau) d\tau$. Proof:
+we can use the same machinery. If someone gives you a candidate solution,
+you can easily show that it is the solution.
+
+Recall the Leibniz rule, which we'll state in general as follows:
+$\pderiv{}{z} \int_{a(z)}^{b(z)} f(x, z) dx = \int_{a(z)}^{b(z)}
+\pderiv{}{z}f(x, z) dx + \pderiv{b}{z} f(b(z), z) - \pderiv{a}{z} f(a(z), z)$.
+
+$$
+\dot{x}(t) & = A(t) \Phi(t, t_0) x_0 + \int_{t_0}^t
+\pderiv{}{t} \parens{\Phi(t, \tau)B(\tau)u(\tau)} d\tau +
+\pderiv{t}{t}\parens{\Phi(t, t)B(t)u(t)} - \pderiv{t_0}{t}\parens{\Phi(t, t_0)B(t_0)u(t_0)}
+\\ & = A(t)\Phi(t, t_0)x_0 + \int_{t_0}^t A(t)\Phi(t,\tau)B(\tau)u(\tau)d\tau + B(t)u(t)
+\\ & = A(t)\Phi(t, t_0) x_0 + A(t)\int_{t_0}^t \Phi(t, \tau)B(\tau)
+u(\tau) d\tau + B(t) u(t)
+\\ & = A(t)\parens{\Phi(t, t_0) x_0 + \int_{t_0}^t \Phi(t, \tau)B(\tau)
+u(\tau) d\tau} + B(t) u(t)
+$$
+
+$x(t) = \Phi(t,t_0)x_0 + \int_{t_0}^t \Phi(t,\tau)B(\tau)u(\tau) d\tau$ is
+good to remember.
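+
+A familiar special case (added here as a sanity check; the time-invariant
+case has not been treated at this point in the course): if $A$ and $B$ are
+constant matrices, then $\Phi(t, t_0) = e^{A(t - t_0)}$ and the formula
+reduces to the usual
+
+$$x(t) = e^{A(t - t_0)} x_0 + \int_{t_0}^t e^{A(t - \tau)} B u(\tau)\, d\tau.$$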
+
+
+Not surprisingly, it depends on the input function over an interval of
+time.
+
+The differential equation is changing over time, therefore the system
+itself is time-varying. No way in general that will be time-invariant,
+since the equation that defines its evolution is changing. You test
+time-invariance or time variance through the response map. But is it
+linear? You have the state transition function, so we can compute the
+response function (recall: readout map composed with the state transition
+function) and ask if this is a linear map.
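+
+Sketch of that check (assuming the standard readout map $y(t) = C(t)x(t) +
+D(t)u(t)$, which is my assumption here rather than something defined
+above): the response is
+
+$$\rho(t, t_0, x_0, u) = C(t)\Phi(t, t_0)x_0 + C(t)\int_{t_0}^t
+\Phi(t, \tau)B(\tau)u(\tau)\, d\tau + D(t)u(t),$$
+
+which is linear in the pair $(x_0, u)$: each term is linear in $x_0$ or in
+$u$, and integration preserves linearity. So the system can be linear even
+though it is time-varying.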
211 fa2012/ee221a/8.md
@@ -0,0 +1,211 @@
+EE 221A: Linear System Theory
+=============================
+September 18, 2012
+------------------
+
+Today:
+
+* proof of existence and uniqueness theorem.
+* [ if time ] introduction to dynamical systems.
+
+First couple of weeks of review to build up basic concepts that we'll be
+drawing upon throughout the course. Either today or Thursday we will launch
+into linear system theory.
+
+We're going to recall where we were last time. We had the fundamental
+theorem of differential equations, which said the following: if we had a
+differential equation, $\dot{x} = f(x,t)$, with initial condition $x(t_0) =
+x_0$, where $x(t) \in \Re^n$, etc, if $f( \cdot , t)$ is Lipschitz
+continuous, and $f(x, \cdot )$ is piecewise continuous, then there exists a
+unique solution to the differential equation / initial condition pair (some
+function $\phi(t)$) wherever you can take the derivative (may not be
+differentiable everywhere: loses differentiability on the points where
+discontinuities exist).
+
+We spent quite a lot of time discussing Lipschitz continuity. Job is
+usually to test both conditions; first one requires work. We described a
+popular candidate function by looking at the mean value theorem and
+applying it to $f$: a norm of the Jacobian of $f$ provides a candidate
+Lipschitz function, if it works.
+
+We also described local Lipschitz continuity, and often, when using a norm
+of the Jacobian, that's fairly easy to show.
+
+Important point to recall: a norm of the Jacobian of $f$ provides a
+candidate Lipschitz function.
+
+Another important thing to say here is that we can use any norm we want, so
+we can be creative in our choice of norm when looking for a better bound.
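+
+For instance (an illustrative example of my own, not from the lecture): for
+the pendulum-like field $f(x) = (x_2, -\sin x_1)$, the Jacobian is
+
+$$Df(x) = \begin{bmatrix} 0 & 1 \\ -\cos x_1 & 0 \end{bmatrix},$$
+
+whose induced $\infty$-norm (maximum absolute row sum) is $\max(1, |\cos
+x_1|) = 1$ for every $x$, so $1$ is a global Lipschitz constant for $f$ in
+the $\infty$-norm.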
+
+We started our proof last day, and we talked a little about the structure
+of the proof. We are going to proceed by constructing a sequence of
+functions, then show (1) that it converges to a solution, then show (2)
+that it is unique.
+
+Proof of Existence
+------------------
+We are going to construct this sequence of functions as follows:
+$x_{m+1}(t) = x_0 + \int_0^t f(x_m(\tau), \tau) d\tau$. Here we're dealing with
+an arbitrary interval from $t_1$ to $t_2$, and so $0 \in [t_1, t_2]$. We
+want to show that this sequence is a Cauchy sequence, and we're going to
+rely on our knowledge that the space these functions are defined in is a
+Banach space (hence this sequence converges to something in the space).
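+
+To make the construction concrete (a standard textbook example of mine, not
+from the lecture): for $\dot{x} = x$ with $x(0) = 1$, starting from the
+constant function $x_0(t) \equiv 1$, the iterates are
+
+$$x_1(t) = 1 + t, \quad x_2(t) = 1 + t + \frac{t^2}{2}, \quad \ldots, \quad
+x_m(t) = \sum_{k=0}^m \frac{t^k}{k!},$$
+
+which converge (uniformly on bounded intervals) to the unique solution
+$e^t$.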
+
+We have to put a norm