feed.json
{"layout":"none","permalink":"/feed/","related_posts":[],"url":"/feed/","content":"<?xml version=\"1.0\" encoding=\"utf-8\"?>\n<rss version=\"2.0\"\n\txmlns:content=\"http://purl.org/rss/1.0/modules/content/\"\n\txmlns:wfw=\"http://wellformedweb.org/CommentAPI/\"\n\txmlns:dc=\"http://purl.org/dc/elements/1.1/\"\n\txmlns:atom=\"http://www.w3.org/2005/Atom\"\n\txmlns:sy=\"http://purl.org/rss/1.0/modules/syndication/\"\n\txmlns:slash=\"http://purl.org/rss/1.0/modules/slash/\"\n\t>\n<channel>\n <title xml:lang=\"en\"></title>\n <atom:link type=\"application/atom+xml\" href=\"http://ben.balter.com/feed/\" rel=\"self\"/>\n <link>http://ben.balter.com</link>\n <pubDate>Wed, 26 Dec 2012 21:32:36 +0000</pubDate>\n <lastBuildDate>Wed, 26 Dec 2012 21:32:36 +0000</lastBuildDate>\n \t<language>en-US</language>\n <description>J.D./M.B.A. Candidate, Open-Source Developer, Gov 2.0 Evangelist</description>\n <item>\n <title>Securing the Status Quo</title>\n <link>http://ben.balter.com/2012/12/18/securing-the-status-quo/</link>\n <pubDate>Tue, 18 Dec 2012 00:00:00 +0000</pubDate>\n <dc:creator>Benjamin J. Balter</dc:creator>\n <category>FISMA</category>\n <category>Cybersecurity</category>\n <category>policy</category>\n <category>government</category>\n <category>Technology</category>\n <guid isPermaLink=\"false\">/2012/12/18/securing-the-status-quo</guid>\n <description><![CDATA[ <h1 id='the_effect_of_federal_it_security_policies_on_innovation'>The Effect of Federal IT Security Policies on Innovation</h1>\n<div class='maruku_toc'><ul style='list-style: none;'><li><a href='#i_the_current_it_security_regime'>I. The Current IT Security Regime</a><ul style='list-style: none;'><li><a href='#a_the_federal_information_security_management_act'>A. The Federal Information Security Management Act</a></li><li><a href='#b_the_privacy_act'>B. The Privacy Act</a></li><li><a href='#c_formal_guidance'>C. Formal Guidance</a></li></ul></li><li><a href='#ii_mere_security_theater'>II. 
Mere Security Theater</a></li><li><a href='#iii_problems_with_existing_information_security_policy'>III. Problems with Existing Information Security Policy</a><ul style='list-style: none;'><li><a href='#a_difficulty_implementing'>A. Difficulty Implementing</a></li><li><a href='#b_ambiguity'>B. Ambiguity</a></li><li><a href='#c_a_culture_of_indifference'>C. A Culture of Indifference</a></li><li><a href='#d_a_disinterested_public'>D. A Disinterested Public</a></li><li><a href='#e_lack_of_accountability'>E. Lack of Accountability</a></li></ul></li><li><a href='#iv_the_toll_on_taxpayers'>IV. The Toll on Taxpayers</a><ul style='list-style: none;'><li><a href='#a_administrative_costs'>A. Administrative Costs</a></li><li><a href='#b_opportunity_costs'>B. Opportunity Costs</a></li></ul></li><li><a href='#v_streamlining_our_nations_information_security'>V. Streamlining our Nation’s Information Security</a><ul style='list-style: none;'><li><a href='#a_carrots_not_sticks'>A. Carrots, not Sticks</a></li><li><a href='#b_reduce_duplication_of_efforts'>B. Reduce Duplication of Efforts</a></li><li><a href='#b_modular_administration'>B. Modular Administration</a></li><li><a href='#c_a_grace_period_for_pilot_programs'>C. A Grace Period for Pilot Programs</a></li></ul></li><li><a href='#conclusion'>Conclusion</a></li></ul></div>\n<p>The United States Federal Government is the single largest purchaser of information security products.<sup id='fnref:1'><a href='#fn:1' rel='footnote'>1</a></sup> In FY 2011 alone, a mere 24 agencies<sup id='fnref:2'><a href='#fn:2' rel='footnote'>2</a></sup> reported a combined IT security budget of $13.3 billion, employing the equivalent of some 84,426 full-time employees with major responsibilities for information security.<sup id='fnref:3'><a href='#fn:3' rel='footnote'>3</a></sup> The gravity of the ongoing threat to our nation’s cybersecurity has been described in the starkest of terms at the highest levels of government. 
Describing our current situation, President Obama remarked that \"[t]he status quo is no longer acceptable\"<sup id='fnref:4'><a href='#fn:4' rel='footnote'>4</a></sup>, members of Congress from both parties have characterized the risk as \"a catastrophe in the making,\"<sup id='fnref:5'><a href='#fn:5' rel='footnote'>5</a></sup> and Secretary of Defense Leon Panetta warned of an imminent \"digital Pearl Harbor.\"<sup id='fnref:6'><a href='#fn:6' rel='footnote'>6</a></sup></p>\n\n<p>Yet, despite all these histrionics, successful attacks against the nation’s most hardened systems are perpetrated each day. In 2005, for example, a lone hacker in the United Kingdom was able to remotely render more than 300 computers at an American Naval Station completely inoperable simply by deleting a single file.<sup id='fnref:7'><a href='#fn:7' rel='footnote'>7</a></sup> In 2011, foreign intruders gained access to more than 24,000 files, including those concerning the military’s most sensitive systems: surveillance technologies, satellite communications systems, and network security protocols,<sup id='fnref:8'><a href='#fn:8' rel='footnote'>8</a></sup> and as recently as October of 2012, the Chinese government attempted to infect computers within the White House office responsible for maintaining the President’s nuclear launch codes.<sup id='fnref:9'><a href='#fn:9' rel='footnote'>9</a></sup></p>\n\n<p>At the same time, the government computer systems that agencies rely on for many internal processes are often years behind the technology readily available to consumers, who can run entire websites from the smartphone resting in their pocket. Whereas once the federal government pushed the capabilities of information systems — using the first computers to calculate projectile trajectories during the Cold War or interconnecting the first machines to give rise to what is today the Internet — today government and innovation are far from synonymous. 
Over the past few decades, the IT industry has undergone a radical transformation towards consumerization, a transformation that has largely left the public sector behind. Whereas once deploying a technology solution would require months of upfront planning and investment, today lean, iterative, and decentralized solutions dominate the marketplace — technologies ill-suited for adoption within the government’s paperwork-laden security regime.</p>\n\n<p>Current government security restrictions rightfully seek to protect large, mission-critical systems. Yet such efforts often come at the cost of significantly hindering the adoption of smaller, less vital systems, many of which — such as blogs or social media tools — can have a greater impact on the day-to-day life of average Americans. The availability of cheaper, leaner solutions, now commonplace in the private sector, can not only provide government with a unique opportunity to meet citizens’ demands to do more with less, but can also empower it to expand service offerings into new verticals, such as transacting additional citizen services online or offering greater transparency within existing offerings. To unlock the potential of emerging technologies, security requirements for pilot programs should be relaxed so that agencies are empowered to follow private-sector best practices to rapidly bring prototypical solutions to market and are given the freedom to tap existing third-party, often free, services.</p>\n<!-- more -->\n<h2 id='i_the_current_it_security_regime'>I. 
The Current IT Security Regime</h2>\n\n<p>The federal government’s current information security policy is defined primarily by two key documents, the Federal Information Security Management Act and the Privacy Act, and is operationalized through a patchwork of recommendations and requirements as outlined by the Federal Acquisition Regulation, the National Institute of Standards and Technology, and the Office of Management and Budget.</p>\n\n<h3 id='a_the_federal_information_security_management_act'>A. The Federal Information Security Management Act</h3>\n\n<p>The Federal Information Security Management Act (FISMA) is designed to provide a comprehensive framework for ensuring the effectiveness of information security controls over information resources that support Federal operations and assets.<sup id='fnref:10'><a href='#fn:10' rel='footnote'>10</a></sup> By harmonizing overlapping agency requirements, eliminating obsolete mandates, and updating outmoded provisions, the Act sought to unify Congress’s attempts over the past decade to address information security needs piecemeal through a scattered mosaic of legislation.<sup id='fnref:11'><a href='#fn:11' rel='footnote'>11</a></sup> Specifically, the act’s purpose is protecting information and information systems from unauthorized access, use, disclosure, disruption, modification, or destruction to maintain the integrity, confidentiality, and availability of such information.<sup id='fnref:12'><a href='#fn:12' rel='footnote'>12</a></sup> FISMA has several components. 
First, it designates the Office of Management and Budget (OMB) director as having the authority to oversee agency information security policies and practices, requires agencies to provide information security protections, and directs OMB to oversee agency compliance with FISMA by conducting annual reviews and producing reports to Congress.<sup id='fnref:13'><a href='#fn:13' rel='footnote'>13</a></sup> To facilitate this end, FISMA empowers OMB with the ability to approve or deny any federal agency’s information security system.<sup id='fnref:14'><a href='#fn:14' rel='footnote'>14</a></sup> Second, FISMA requires that the National Institute of Standards and Technology (NIST) promulgate the security standards with which agencies must comply.<sup id='fnref:15'><a href='#fn:15' rel='footnote'>15</a></sup> NIST fulfills this obligation by producing Federal Information Processing Standards Publication (FIPS) 199. FIPS 199 specifies guidelines for defining the potential impact of a security breach (low, moderate, high) resulting in the loss of confidentiality, integrity, or availability of government information.<sup id='fnref:16'><a href='#fn:16' rel='footnote'>16</a></sup> Third, FISMA requires a certification and accreditation (C&A) — measuring the efforts to define the appropriate risk on a system-by-system basis — and a plan of action and milestones (POA&M) — measuring compliance with established methodologies for correcting discrepancies.<sup id='fnref:17'><a href='#fn:17' rel='footnote'>17</a></sup> Both metrics are designed to measure the people, process, and technology aspects of security, but arguably fail to properly represent true operational security on a system-by-system basis.<sup id='fnref:18'><a href='#fn:18' rel='footnote'>18</a></sup> Finally, under FISMA’s fourth requirement, plans and procedures must ensure continuity of operations [(COOP)] for information systems.<sup id='fnref:19'><a href='#fn:19' rel='footnote'>19</a></sup> Such provisions generally include 
backup systems, transition plans, and security controls to maintain continuity of operations during loss of power, disasters, or other potential interruptions of service.<sup id='fnref:20'><a href='#fn:20' rel='footnote'>20</a></sup> FISMA does not require security at all costs but instead includes cost-effectiveness among its considerations, requiring the implementation of policies and procedures to cost-effectively reduce risks to an acceptable level.<sup id='fnref:21'><a href='#fn:21' rel='footnote'>21</a></sup></p>\n\n<h3 id='b_the_privacy_act'>B. The Privacy Act</h3>\n\n<p>The second major body of security legislation governing agencies’ actions is the Privacy Act of 1974.<sup id='fnref:22'><a href='#fn:22' rel='footnote'>22</a></sup> The act, which was passed on December 31, 1974 and went into effect in September of the subsequent year, serves the dual purpose of preventing the disclosure of individuals’ private information by government agencies that collect it (<em>e.g.,</em> names or other identifying information linked to education, financial transactions, medical history, and criminal or employment history), and enabling individuals to determine what information has been collected, as well as to verify its accuracy.<sup id='fnref:23'><a href='#fn:23' rel='footnote'>23</a></sup> Specifically, as outlined in the E-Government Act of 2002, agencies are required to conduct a Privacy Impact Assessment (PIA) prior to \"developing or procuring information technology or initiating a new collection of information in an identifiable form.\"<sup id='fnref:24'><a href='#fn:24' rel='footnote'>24</a></sup> For each new information collection, an agency must publish a PIA which addresses (1) what information is being collected; (2) why the information is being collected; (3) the intended use of the agency information; (4) with whom the information will be shared; (5) what notice or opportunities will be provided to individuals regarding what information is collected and with 
whom that information will be shared; (6) how the information will be secured; and (7) whether the information collection will create a system of records.<sup id='fnref:25'><a href='#fn:25' rel='footnote'>25</a></sup> PIAs are intended to be commensurate with the size of the information system being assessed, the sensitivity of information that is in an identifiable form in that system, and the risk of harm from unauthorized release of that information[.]<sup id='fnref:26'><a href='#fn:26' rel='footnote'>26</a></sup> The first step, then, for an agency wishing to deploy information technology is to determine if such a system constitutes an information system under the act.</p>\n\n<p>The most recent definition of information system comes from the 1995 revision of the Paperwork Reduction Act (PRA).<sup id='fnref:27'><a href='#fn:27' rel='footnote'>27</a></sup> The PRA expanded the definition of information system from management information system to any discrete set of information resources organized for the collection, processing, maintenance, use, sharing, dissemination, or disposition of information.<sup id='fnref:28'><a href='#fn:28' rel='footnote'>28</a></sup> In this context, information resources include information and related resources, such as personnel, equipment, funds, and information technology.<sup id='fnref:29'><a href='#fn:29' rel='footnote'>29</a></sup> At the time of its drafting, information systems roughly corresponded with a physical machine (e.g., a computer). These physical boundaries made the task of privacy oversight relatively straightforward.<sup id='fnref:30'><a href='#fn:30' rel='footnote'>30</a></sup> In the 1960s and 1970s, computing was centralized: government and large corporations owned the computers. With the emergence of the personal computer in the late 1970s and 1980s, this computing power was decentralized as the technology became affordable to the average consumer. 
This decentralizing trend continued into the 1990s with the explosion of mobile computing. Today however, we are seeing a shift back to a more centralized computing model. As hardware becomes commoditized, computing power is once again being aggregated into large datacenters, and becoming accessible much like a public utility.<sup id='fnref:31'><a href='#fn:31' rel='footnote'>31</a></sup> This blurring of physical lines — where multiple processes may live on the same physical machine, with many virtual machines being constantly created and destroyed — has the potential to significantly complicate an agency’s determination as to when an information system exists.</p>\n\n<p>The second threshold determination complicated by recent advances in technology is the creation of a system of records. The Privacy Act requires agencies to report the systems of records that they maintain, and for each system, to describe in detail, what information is collected, the purpose of the information collection, and to whom requests for records should be sent.<sup id='fnref:32'><a href='#fn:32' rel='footnote'>32</a></sup> Prior to establishing any new system of records, agencies are required to go through a lengthy approval process, and to publish a System of Records Notice (SORN) in the Federal Register.<sup id='fnref:33'><a href='#fn:33' rel='footnote'>33</a></sup> The term system of records is an artificial distinction, legislatively created to identify those groups of records to which the Privacy Act applies, and those to which it does not apply.<sup id='fnref:34'><a href='#fn:34' rel='footnote'>34</a></sup> The Act itself defines a system of records as a group of any records under the control of any agency from which information is retrieved by the name of the individual or by some identifying number, symbol, or other identifying particular assigned to the individual.<sup id='fnref:35'><a href='#fn:35' rel='footnote'>35</a></sup> This analysis involves examining the actual methods 
a given system uses to store and retrieve information.<sup id='fnref:36'><a href='#fn:36' rel='footnote'>36</a></sup></p>\n\n<p>Such an analysis is easy to apply to physical systems of records (e.g., records in a doctor’s office organized by patient name or Social Security number) or the first computerized databases that required strict, pre-defined indexes to efficiently retrieve a given record. Today, however, most methods of storage and retrieval allow filtering records by any composite data element (e.g., SQL), and emerging methods of searching large amounts of data (e.g., MapReduce, Hadoop) have no index at all. This raises the question of how many retrievals of information using one’s name or other unique identifier, even if not via a traditional index, are sufficient to create a system of records.<sup id='fnref:37'><a href='#fn:37' rel='footnote'>37</a></sup> Compounding the problem, industry best practices dictate that even the simplest systems — be they public-facing systems like a website or internal business process systems to aid in agency functions — assign unique identifiers to each record to aid in retrieval. Clearly, the Privacy Act has not kept pace with the reality of how technology is actually used to support an agency’s mission.</p>\n\n<h3 id='c_formal_guidance'>C. Formal Guidance</h3>\n\n<p>The third source of agency responsibility comes from formal guidance promulgated by OMB and NIST. The Federal Acquisition Regulation (FAR) requires agencies to comply with OMB’s implementing policies, including Appendix III of OMB Circular A-130, and guidance and standards from the Department of Commerce’s National Institute of Standards and Technology.<sup id='fnref:38'><a href='#fn:38' rel='footnote'>38</a></sup> Circular A-130 requires agencies to incorporate a system security plan within the information resource management (IRM) planning processes as required by the Paperwork Reduction Act. 
Such a plan must include established rules of behavior regarding each system, adequate training, personnel controls, incident response capabilities, continuity of support, technical security, and written agreements for any interconnections between systems. A-130 also outlines additional, more stringent requirements for major systems.</p>\n\n<p>NIST publishes its requirements as part of Federal Information Processing Standards (FIPS) 199 and Special Publication (SP) 800-53.<sup id='fnref:39'><a href='#fn:39' rel='footnote'>39</a></sup> FIPS 199 provides a framework for determining the level of risk associated with the potential impact to an organization or individuals for a given system across three metrics: confidentiality, integrity, and availability. This, in turn, informs additional requirements depending on the security category assigned. SP 800-53 further outlines the various security controls agencies must implement for each system. Such controls include management controls — security assessment and authorization, risk assessment, system and services acquisition, and program management; operational controls — personnel security, physical and environmental security, contingency planning, configuration management, maintenance, systems and information integrity, media protection, incident response, and awareness and training; and technical controls — identification and authentication, access control, audit and accountability, and system and communications protection. Various other publications impose additional continuous monitoring requirements beyond the initial certification and accreditation of the application.<sup id='fnref:40'><a href='#fn:40' rel='footnote'>40</a></sup></p>\n\n<h2 id='ii_mere_security_theater'>II. Mere Security Theater</h2>\n\n<p>The myriad guidance and labyrinthine requirements may do less to secure our nation’s information systems than policymakers hope. 
A number of critics have complained that the current security regime incentivizes agencies to perform a simple box-checking paperwork exercise that does not keep up with an ever-expanding and ever-changing world of threats.<sup id='fnref:41'><a href='#fn:41' rel='footnote'>41</a></sup> This paperwork drill puts into place and measures paper-based processes, rather than technical processes, for implementing information security,<sup id='fnref:42'><a href='#fn:42' rel='footnote'>42</a></sup> and ultimately fails to address the root cause of network exploitation: inadequate software quality assurance.<sup id='fnref:43'><a href='#fn:43' rel='footnote'>43</a></sup> Even assuming 100% compliance with all FISMA requirements, many believe that agencies would fall short of bolstering actual information security.<sup id='fnref:44'><a href='#fn:44' rel='footnote'>44</a></sup> Tim Bennett, President of the Cyber Security Industry Alliance, lamented that FISMA grades not how well agencies have increased their information security, but rather how well agencies increase their compliance with the FISMA-mandated processes.<sup id='fnref:45'><a href='#fn:45' rel='footnote'>45</a></sup> The current FISMA reports say absolutely nothing about government security but rather are merely a measure of compliance with report generation.<sup id='fnref:46'><a href='#fn:46' rel='footnote'>46</a></sup> FISMA erroneously assumes an agency’s compliance with largely reactionary standards is an objective measure of information security. Instead, such an assumption incentivizes agencies to expend efforts on reporting, rather than addressing the underlying security threats the reports represent.<sup id='fnref:47'><a href='#fn:47' rel='footnote'>47</a></sup> By focusing on security audits, rather than the actual security of the systems, FISMA provides a framework for Federal Chief Information Officers (CIOs) to quantify their progress in ways a large, non-technical bureaucracy can easily digest. 
Rather than drill down on the number or nature of attacks deflected, CIOs could simply report that they have achieved a specified level of FISMA compliance, and thus could justify their various budgets.<sup id='fnref:48'><a href='#fn:48' rel='footnote'>48</a></sup> Put another way, FISMA provides little incentive to address any security metric outside of those explicitly required of agencies by FISMA.<sup id='fnref:49'><a href='#fn:49' rel='footnote'>49</a></sup></p>\n\n<p>This focus on reporting requirements and audits, rather than actual information security, may explain why cyber attacks increased some 250% between 2007 and 2009.<sup id='fnref:50'><a href='#fn:50' rel='footnote'>50</a></sup> The Computer Emergency Response Team (CERT) estimates that 95% of intrusions exploited known software vulnerabilities to which countermeasures were readily available.<sup id='fnref:51'><a href='#fn:51' rel='footnote'>51</a></sup> [G]iven the fluid nature of the technology industry, and the reliance on cash flow to support operations, software manufacturers face increased pressure to rush their products to market in order to better capitalize on the product’s innovation.<sup id='fnref:52'><a href='#fn:52' rel='footnote'>52</a></sup> This creates a race to the bottom in terms of software quality. Given strict budgets and timelines, agencies are forced to either cut desired functionality or skirt security requirements.</p>\n\n<p>Such a focus on reporting and managerial controls rather than on actual information security manifests itself on a near-daily basis. 
Despite our nation’s best efforts at complying with codified security policy, thousands of computers on our military networks were infected by malware,<sup id='fnref:53'><a href='#fn:53' rel='footnote'>53</a></sup> unauthorized users in Iran were able to gain access to blueprints and other information about a helicopter in the President’s fleet,<sup id='fnref:54'><a href='#fn:54' rel='footnote'>54</a></sup> and the State and Defense departments have lost more than six terabytes of information due to digital espionage, an amount equal to one-sixth of the information contained within the Library of Congress.<sup id='fnref:55'><a href='#fn:55' rel='footnote'>55</a></sup> As recently as last year, a single Army private was able to steal more than a quarter of a million classified State Department cables and nearly 100,000 intelligence reports.<sup id='fnref:56'><a href='#fn:56' rel='footnote'>56</a></sup> This deficiency, however, is not limited to threats to national security. It is also personal. The Department of Defense was hit with a $4.9 billion class-action suit as a result of the theft of the personal information of some 4.9 million uniformed service members and their families,<sup id='fnref:57'><a href='#fn:57' rel='footnote'>57</a></sup> the Department of Veterans Affairs lost health records and other sensitive personal information for approximately 26.5 million veterans and their spouses,<sup id='fnref:58'><a href='#fn:58' rel='footnote'>58</a></sup> the Secretary of Defense’s unclassified e-mail was hacked,<sup id='fnref:59'><a href='#fn:59' rel='footnote'>59</a></sup> and the Navy CIO had his personal information compromised not once or twice, but on six distinct occasions.<sup id='fnref:60'><a href='#fn:60' rel='footnote'>60</a></sup></p>\n\n<h2 id='iii_problems_with_existing_information_security_policy'>III. Problems with Existing Information Security Policy</h2>\n\n<p>Existing security policies fail to properly protect federal information systems. 
Empirically, agencies are unable to implement the security controls as defined by FISMA. This may be due, at least in part, to the ambiguity with which the anachronistic policy is shackled when faced with modern advances in technology. Even when requirements are clear, in the face of competing priorities such obligations are often met with indifference by the agency, and unless a major attack brings cybersecurity to the forefront, such ambivalence is likely to remain within the public as well. Finally, Congress’s overall lack of attention renders accountability a challenge, and compliance suffers further as a result. Consequently, the existing security regime may fall short of its promises.</p>\n\n<h3 id='a_difficulty_implementing'>A. Difficulty Implementing</h3>\n\n<p>Empirically, agencies struggle to meet the requirements asked of them by existing security policy. To date, none of the 24 major agencies<sup id='fnref:61'><a href='#fn:61' rel='footnote'>61</a></sup> have fully implemented the agency-wide information security programs required by FISMA.<sup id='fnref:62'><a href='#fn:62' rel='footnote'>62</a></sup> Seven of those agencies described their security as poor, nine as satisfactory, and only six as good.<sup id='fnref:63'><a href='#fn:63' rel='footnote'>63</a></sup> It should come as no surprise, then, that the House Government Reform Committee rated government-wide FISMA compliance a D+.<sup id='fnref:64'><a href='#fn:64' rel='footnote'>64</a></sup> In 2004, two years after the act was penned, 23% of federal IT systems lacked the required risk assessment, and one in four lacked the contingency plans necessary to ensure continuity of operations. Of those with plans, a mere 57% had ever been tested. Nine of the 24 agencies did not even have a complete inventory of their IT systems.<sup id='fnref:65'><a href='#fn:65' rel='footnote'>65</a></sup> Today, nearly a decade after FISMA was enacted, the numbers remain similarly bleak. 
As the federal IT footprint continues to swell and budgets continue to tighten, in FY2011 only 33% of agencies reported compliance with FISMA’s risk management requirements, and the same one-in-three ratio had programs compliant with the contingency planning requirements. Only one in four were compliant with configuration management, POA&M, and identity and access management requirements. Thirteen percent had not even begun to implement continuous monitoring programs, and one in five still lacked an accurate IT inventory.<sup id='fnref:66'><a href='#fn:66' rel='footnote'>66</a></sup></p>\n\n<p>One example of agencies’ difficulty complying can be seen in the Department of Veterans Affairs (VA). In 2007, a hard drive containing nearly 200,000 veterans’ records went missing from a safe. The Office of the Inspector General concluded that the VA’s security plan did not comply with its own rules for securing data, and that it improperly allowed an IT specialist entrusted with the data access to information beyond his clearance.<sup id='fnref:67'><a href='#fn:67' rel='footnote'>67</a></sup> After public disclosure, a lawsuit arose, and when the opinion was issued nearly two years later, the court noted that it had no reason to think that all of the alleged violations had been remedied. Robert T. 
Howard, Assistant Secretary for Information and Technology, stated before a Senate inquiry that the hard drive theft was a wake-up call and that \"[a]s a result of that incident we began to create the environment needed to better protect the sensitive information entrusted to us.\"<sup id='fnref:68'><a href='#fn:68' rel='footnote'>68</a></sup> Despite the lesson learned, four years after the incident, the VA reported that only 55% of its portable devices had FISMA-mandated encryption.<sup id='fnref:69'><a href='#fn:69' rel='footnote'>69</a></sup> A related case involving the Bureau of Indian Affairs (BIA) resulted in an injunction requiring that various IT systems be disconnected from the Internet due to inadequate information security. The court noted that the agency’s FISMA compliance had lagged behind the expansion of the department’s Internet presence.<sup id='fnref:70'><a href='#fn:70' rel='footnote'>70</a></sup> Although anecdotal, both incidents confirm one thing: on the whole, federal agencies are unable to adequately implement the requirements imposed on them by FISMA.</p>\n\n<h3 id='b_ambiguity'>B. Ambiguity</h3>\n\n<p>Even when agencies strive to comply with their information security obligations, FISMA’s requirements are often drafted so ambiguously as to make knowing what those obligations are impossible.<sup id='fnref:71'><a href='#fn:71' rel='footnote'>71</a></sup> In certain situations, for example, agencies are required to ensure the security of federal data maintained on third-party systems. Yet the scope of this obligation, defined as when such systems are operated \"on behalf of an agency,\"<sup id='fnref:72'><a href='#fn:72' rel='footnote'>72</a></sup> is often ambiguous, and the act’s legislative history sheds no additional light on the phrase. The congressional report accompanying FISMA is written largely in broad terms, with platitudes about the importance of information security. 
The legislative history in fact shows virtually no congressional focus on the nitty-gritty of how the statute is to be implemented.<sup id='fnref:73'><a href='#fn:73' rel='footnote'>73</a></sup> Additionally, FISMA proves no mechanism for agencies to clarify such ambiguities. This can result in months (or longer) of inaction as bureaucrats and lawyers struggle to interpret a technical statutory scheme with which they may have little familiarity.<sup id='fnref:74'><a href='#fn:74' rel='footnote'>74</a></sup> Perhaps among the most telling signs of FISMAs troubling ambiguity is the birth of a Beltway industry specializing in FISMA compliance. As one security-consulting firm asserted, Information Security and Privacy regulations are purposely vague to ensure they cover a wide range of organizations over a long period of time without having to be amended by Congress.<sup id='fnref:75'><a href='#fn:75' rel='footnote'>75</a></sup> FISMA as currently drafted does not provide adequate guidance for agencies to properly implement its requirements.</p>\n\n<h3 id='c_a_culture_of_indifference'>C. A Culture of Indifference</h3>\n\n<p>Compounding the difficulties surrounding implementation is a cloud of indifference. Most agency employees likely view FISMAs requirements as unsexy. Workers are more motivated and more productive, and thus more effective when they feel as if the task at hand is worthwhile.<sup id='fnref:76'><a href='#fn:76' rel='footnote'>76</a></sup> Yet an agencys information security activities have no direct connection to its substantive policy goals. It is not a stretch to think agency employees may lack the sense that operationalizing the requirements of FISMA is important or particularly urgent. 
Where bureaucrats are unmotivated, inertia can easily overpower the impetus to make costly changes as agency employees seek to minimize the amount of FISMA work they must do.<sup id='fnref:77'><a href='#fn:77' rel='footnote'>77</a></sup> Put another way, without clear alignment with policy goals, implementation of FISMA may be destined to remain “good enough for government work.”</p>\n\n<p>The court in the BIA case noted earlier<sup id='fnref:78'><a href='#fn:78' rel='footnote'>78</a></sup> described a bureaucratic culture marked by indifference, confusion, and lack of accountability. In one passage, the court discussed “a stunning lack of management and oversight” in the context of the departmental IT security program. While technicians were aware of the security vulnerabilities, they made no effort to fix them. The court continued, “Interior’s IT security planners have discussed [the necessary fix] only in concept… [One DOI senior official testified that he was] not aware that we’ve actually put it down into a formalized, written plan.”<sup id='fnref:79'><a href='#fn:79' rel='footnote'>79</a></sup> Such indifference not only threatens FISMA implementation, but also can render our nation’s information systems less secure.</p>\n\n<h3 id='d_a_disinterested_public'>D. A Disinterested Public</h3>\n\n<p>This indifference toward FISMA, however, does not remain solely within agency walls. An enormous security breach could trigger widespread public outrage; if, for example, individuals’ tax filings were compromised, it is likely the issue would come to the forefront of taxpayers’ minds. Absent such a breach, however, public interest is likely to remain low.<sup id='fnref:80'><a href='#fn:80' rel='footnote'>80</a></sup> This lack of public interest can create a culture of complacency within Congress as well. 
It is difficult to imagine that members of Congress receive many constituent inquiries into the pace of FISMA adoption, or believe that their ability to seek reelection is tied closely to the day-to-day success of agency CIOs. While citizens may cast ballots in favor of any issue they choose, FISMA is unlikely to come out of the periphery and receive enough public attention to strongly influence congressional elections, and thus one can assume that congressional oversight of information security is low on both Congress’s and the public’s lists of priorities.<sup id='fnref:81'><a href='#fn:81' rel='footnote'>81</a></sup></p>\n\n<h3 id='e_lack_of_accountability'>E. Lack of Accountability</h3>\n\n<p>Between employee indifference, public apathy, and general congressional inattentiveness, agencies’ compliance with FISMA-mandated requirements (or lack thereof) receives far less scrutiny than work considered core to the agency’s mission. As such, deficiencies may go underappreciated or wholly unnoticed.<sup id='fnref:82'><a href='#fn:82' rel='footnote'>82</a></sup> Even when the OMB reporting process uncovers such failures, it is unlikely that consequences would result. “[T]he nature of the stick is so draconian and counterproductive to agency effectiveness that it is hard to imagine OMB ever fully imposing it.” Federal IT has led to vast improvements in the way government agencies deliver citizen services and perform day-to-day operations, and to punish an agency by retarding its ability to take advantage of such advancements seems unlikely. Further, given the sheer number of delinquent agencies, bureaucrats may be able to take solace in a “safety in numbers” mentality. 
While OMB might plausibly withhold funding from one or two deviant bureaus, mass sanctions would be both politically inconceivable and functionally disastrous, effectively neutralizing any threat implicit within OMB’s oversight role.<sup id='fnref:83'><a href='#fn:83' rel='footnote'>83</a></sup></p>\n\n<h2 id='iv_the_toll_on_taxpayers'>IV. The Toll on Taxpayers</h2>\n\n<p>Such a heavyweight information security policy affects taxpayers in two ways. First, it imparts a direct administrative cost as agencies divert man-hours from supporting the agency’s mission to supporting the significant administrative burden. Second, it imparts an indirect cost, as agencies struggle to operationalize new technologies and approaches to delivering citizen services and transacting the business of the nation.</p>\n\n<h3 id='a_administrative_costs'>A. Administrative Costs</h3>\n\n<p>As noted earlier, FISMA and its associated implementation guidance impose a significant administrative burden on agencies. For example, assume for a moment that a federal agency would like to establish a simple blog for a newly announced short-term initiative. The blog would allow agency staff members to post regular updates on the initiative and would provide members of the public with the opportunity to post comments, a fairly standard practice among similar websites. In the private sector, even a non-technical content creator or subject-matter expert could simply navigate to one of the many companies that provide blog hosting as a service, enter a purchase card number, and have an entire website up and running in a matter of minutes. 
In the public sector, however, such is not the case.</p>\n\n<p>The program office initiating the request would have to contact the chief security officer to begin the FIPS-199 mandated process of determining the FISMA risk level, which in this hypothetical, after some work, we can assume will be low given the non-essential nature of the website. Next, the software and the facilities that host it will be required to undergo a C&A process, even if another agency has already done the same. This process may require a significant investment of time if, for example, the hosting company has not previously undergone an invasive third-party security audit, and may even require the agency to host the software within its own datacenter (thus adding to the overall cost) if the hosting company, presumably a small technology startup, is unwilling to do so. Once a primary datacenter and application is certified, COOP may also require that a fallback, parallel website be concurrently established on a separate system (potentially doubling the cost imposed).</p>\n\n<p>Once technical requirements are met, significant legal requirements must be met as well. If, as in this hypothetical, the agency wishes to allow citizens who provide their name the opportunity to respond to each post (much like Facebook or most blogs on the Internet today), the agency will most likely be required to conduct a PIA, including the 30- and 60-day comment periods, receive OMB approval for the information collection under the PRA, and publish a SORN if comments are to be retrieved by the commenter’s name, as is generally the case with blogs.</p>\n\n<p>Finally, the agency must establish a formal security plan and administrative controls. The security plan would outline rules of behavior, training, personnel controls, incident response, continuity of support, technical security, and, if externally hosted as in the above example, an interconnection agreement with the third-party hosting company. 
Last, the agency would have to design and fully implement SP 800 controls, including management controls — security assessment and authorization, risk assessment, system and services acquisition, and program management; operational controls — personnel security, physical and environmental security, contingency planning, configuration management, maintenance, systems and information integrity, media protection, incident response, and awareness and training; and technical controls — identification and authorization, access control, audit and accountability, and system and communications protection.</p>\n\n<p>To fully comply with both the letter and spirit of current security requirements imposes a significant administrative cost on agencies. While such a heavy-handed procedure is proportionate for our nation’s mission-critical systems, a tension arises when that same procedure is applied to today’s widely consumerized technologies.</p>\n\n<h3 id='b_opportunity_costs'>B. Opportunity Costs</h3>\n\n<p>Beyond the direct administrative costs imposed by the government’s existing security policy, there is also a hidden opportunity cost. Imagine, as in the above example, an agency looking to engage the public around a short, three-month policy initiative. Because the administrative overhead required by existing security policy may impose a three- to six-month lead-time on the agency before it could publish its first word, it is likely to forego the undertaking entirely. Such lost opportunities, however, are not simply limited to blogs. 
As the industry increasingly moves to a hosted service provider model (infrastructure as a service, platform as a service, and software as a service), such conflicts are going to become increasingly common, be they innovative new social networks, such as the code sharing service GitHub, or cloud-based business tools such as the collaboration suite Basecamp or the customer relationship manager Salesforce.</p>\n\n<p>The private sector’s success and its recent explosion of web-based startups can be attributed, at least in part, to cloud computing’s ease of deployment and an emerging service-oriented architecture. A single web developer can programmatically create and destroy servers on demand to meet scalability requirements, or simply to rapidly prototype a new application. At the same time, service providers are increasingly exposing their underlying data as application programming interfaces (APIs), allowing developers to loosely connect systems to rapidly bring products to market. Government developers, however, are precluded, absent extensive administrative costs, from fully taking advantage of these advances, and thus the means by which citizen services are transacted are often generations behind their private-sector counterparts.</p>\n\n<h2 id='v_streamlining_our_nations_information_security'>V. Streamlining Our Nation’s Information Security</h2>\n\n<p>The nation’s security policy is not inherently flawed, and for many mission-critical IT systems, taxpayers should demand no less. For many smaller, citizen-facing systems, however, a streamlined authorization process could more rapidly deliver smarter, better tools with which government agencies can transact the nation’s business, and do so in a secure and efficient manner.</p>\n\n<h3 id='a_carrots_not_sticks'>A. Carrots, not Sticks</h3>\n\n<p>OMB should seek to enforce FISMA compliance through incentives, rather than punishments. 
Agencies with exemplary security records should qualify for additional funding, resources, and personnel to induce others to follow their lead.<sup id='fnref:84'><a href='#fn:84' rel='footnote'>84</a></sup> Bureaucrats, by their nature, are hungry for additional funding. As is well established in the private sector, the prospect of greater resources at an agency CIO’s disposal is a far greater motivator than simply seeking to avoid reprimand from OMB.<sup id='fnref:85'><a href='#fn:85' rel='footnote'>85</a></sup> Such incentives would not only encourage compliance with existing requirements, but would encourage CIOs to think critically about their security posture in hopes of a resource award, thus ameliorating the threat of mere box-checking.</p>\n\n<h3 id='b_reduce_duplication_of_efforts'>B. Reduce Duplication of Efforts</h3>\n\n<p>Certifications and other collateral should be fully transferable between agencies. This would have two implications. First, when multiple agencies utilize a single information resource, OMB could determine which agency would bear the responsibility for ensuring FISMA compliance, saving the other agency (and taxpayers) significant time and effort.<sup id='fnref:86'><a href='#fn:86' rel='footnote'>86</a></sup> Second, an application certified by one agency should be fully transferable to another agency. In the blog example used earlier, if one agency had gone through the certification process for the service provider, another agency would be able to piggyback on those efforts. Such an arrangement may require OMB to curate a collaborative commons of shared certifications and records made available to agencies.</p>\n\n<h3 id='b_modular_administration'>C. Modular Administration</h3>\n\n<p>OMB should create guidance to decouple the various components that make up an information system from the administrative assets that must accompany them. 
An agency website, for example, might consist of the cloud service provider, the generic software image from which the server is based, and the content management system that generates the output visitors see (collectively often referred to as the technology stack). As many of these components are often reused between information systems (e.g., another website may share the same server image, or a database or other application may share the same datacenter), if agencies could analyze each component independently, in a modular fashion, they could realize significant savings in the long run, especially if such certifications could be shared among agencies.</p>\n\n<h3 id='c_a_grace_period_for_pilot_programs'>D. A Grace Period for Pilot Programs</h3>\n\n<p>Last, and most importantly, Congress should carve out a pilot-program exception within existing IT security requirements. Given the pace at which consumer technology advances, if agencies wish to adopt new, innovative services to improve non-mission-critical lines of business, they should be permitted to do so with limited administrative overhead. Even with a rigid adherence to industry best practices, given such a grace period, agencies could rapidly prototype and onboard new applications and services to better the delivery of citizen services, and, if successful, could ensure FISMA compliance in tandem with the ongoing pilot.</p>\n\n<h2 id='conclusion'>Conclusion</h2>\n\n<p>Congress has imposed on federal executive agencies an onerous system to ensure information security benchmarks are met. Despite OMB’s best efforts, however, compliance is low, and successful attacks continue. Agencies are finding it difficult to complete the myriad requirements that become increasingly ambiguous in the face of an ever-changing technology landscape. This may be due, at least in part, to an indifference endemic to both agency implementers and the public. Such requirements, however, do not come without a cost. 
FISMA creates significant administrative overhead for agencies looking to innovate, and in many cases may do so to such an extent as to retard or otherwise prevent adoption. As a result, some argue that, by inhibiting such innovation, today’s federal security policies secure nothing more than the status quo. As communications technology becomes increasingly consumerized, the opportunity for federal agencies to do more with less, and to streamline the delivery of, or expand into, new citizen services has never been more apparent. Yet at the same time, public sector adoption is increasingly falling behind its private sector counterparts. Instead, OMB should seek to incentivize those agencies that best secure federal information assets and seek out innovative, secure solutions to transacting the nation’s business. Existing requirements can be streamlined, such as by breaking security analyses into the disparate technology components they represent and allowing such modules of certification to be shared among systems and agencies. Finally, a formal grace period for low-risk, high-impact, citizen-facing systems can usher in a new era of transparency and collaborative democracy yet unimagined. Our nation’s information is one of its chief resources, and great care should be taken to secure it, just as we secure our territories and tangible interests. Such security, however, and the overhead required to implement it, should be proportionate to the risk involved, and should secure the information systems of tomorrow, not simply the status quo.</p>\n<div class='footnotes'><hr /><ol><li id='fn:1'>\n<p>Ctr. for Strategic & Int’l Studies, Securing Cyberspace for the 44th Presidency 56 (2008), available at http://bit.ly/UvUVew.</p>\n<a href='#fnref:1' rev='footnote'>↩</a></li><li id='fn:2'>\n<p>Chief Financial Officers Act of 1990, Pub. L. 101–576, Nov. 15, 1990, 104 Stat. 
2838.</p>\n<a href='#fnref:2' rev='footnote'>↩</a></li><li id='fn:3'>\n<p>Office of Management and Budget Fiscal Year 2011 Report to Congress on the Implementation of the Federal Information Security Act of 2002, March 7, 2012, available at http://1.usa.gov/yiyFBb.</p>\n<a href='#fnref:3' rev='footnote'>↩</a></li><li id='fn:4'>\n<p>White House Office of the Press Secretary, Remarks by the President on Securing Our Nation’s Cyber Infrastructure (May 29, 2009), http://1.usa.gov/gvW7VM.</p>\n<a href='#fnref:4' rev='footnote'>↩</a></li><li id='fn:5'>\n<p>Cybersecurity: Next Steps To Protect Our Critical Infrastructure: Hearing Before the S. Comm. on Commerce, Science & Transportation, 111th Cong. (Feb. 23, 2010) (statement of Sen. Rockefeller) (A major cyberattack could shut down our nation’s most critical infrastructure….), http://1.usa.gov/TLlgEh; Senate Comm. on Commerce, Science & Transportation, Press Release, Rockefeller and Snowe Gain Momentum for Landmark Cybersecurity Act (Mar. 24, 2010) (statement of Sen. Snowe) (cyber intrusions and attacks represent both a potential national security and economic catastrophe), http://1.usa.gov/VXWVx0.</p>\n<a href='#fnref:5' rev='footnote'>↩</a></li><li id='fn:6'>\n<p>Marshall, Panetta Discusses Security Challenges in Stratcom Visit, American Forces Press Service, Aug. 5, 2011, http://1.usa.gov/pF6Fqx.</p>\n<a href='#fnref:6' rev='footnote'>↩</a></li><li id='fn:7'>\n<p>Rethinking Fisma and Federal Information Security Policy, 81 N.Y.U. L. Rev. 
1844, 1846 (2006), citing Catriona Davies, US Army Computers Shut Down by Hacker, Daily Telegraph (London), July 28, 2005, at 11 (internal quotation marks omitted).</p>\n<a href='#fnref:7' rev='footnote'>↩</a></li><li id='fn:8'>\n<p>Ukman, Jason, 24,000 Pentagon files stolen in major cyber breach, officials say, Washington Post, July 14, 2011, available at http://wapo.st/o5wKnu.</p>\n<a href='#fnref:8' rev='footnote'>↩</a></li><li id='fn:9'>\n<p>Keneally, Meghan, Chinese government hacks into White House office in charge of the nuclear launch codes, Daily Mail, Oct. 1, 2012, available at http://bit.ly/SSZJgH.</p>\n<a href='#fnref:9' rev='footnote'>↩</a></li><li id='fn:10'>\n<p>44 U.S.C.A. 3541(1).</p>\n<a href='#fnref:10' rev='footnote'>↩</a></li><li id='fn:11'>\n<p>Robert Silvers, Rethinking Fisma and Federal Information Security Policy, 81 N.Y.U. L. Rev. 1844, 1847-48 (2006), citing <a>H.R. Rep. No. 107-787, pt.1, at 54 (2002)</a>, as reprinted in 2002 U.S.C.C.A.N. 1880, 1889 (noting that FISMA consolidates the Government Information Security Reform Act, Pub. L. No. 106-398, sec. 1061-65, 3531-36, 114 Stat. 1654A, 266-75 (2000), the Information Technology Management Reform (Clinger-Cohen) Act of 1996, Pub. L. No. 104-106, 5001-02, 110 Stat. 679, 679-80, the Computer Security Act, Pub. L. No. 100-235, 101 Stat. 1724 (1988), and the Paperwork Reduction Act of 1980, Pub. L. No. 96-511, 94 Stat. 2812).</p>\n<a href='#fnref:11' rev='footnote'>↩</a></li><li id='fn:12'>\n<p>44 U.S.C.A 3542(b)(1); see also FAR 2.101(b); 70 Fed. Reg. 57449, 57451 (Sept. 30, 2005).</p>\n<a href='#fnref:12' rev='footnote'>↩</a></li><li id='fn:13'>\n<p>44 U.S.C. 3543(a).</p>\n<a href='#fnref:13' rev='footnote'>↩</a></li><li id='fn:14'>\n<p>Id. 3543(a)(5).</p>\n<a href='#fnref:14' rev='footnote'>↩</a></li><li id='fn:15'>\n<p>44 U.S.C.A. 
3543(a).</p>\n<a href='#fnref:15' rev='footnote'>↩</a></li><li id='fn:16'>\n<p>Federal Information Processing Standards Publication 199, Standards for Security Categorization of Federal Information Systems 1, 2 (Feb. 2004), available at http://1.usa.gov/4nzS8. FAR 11.102 and 11.201 include references to the FIPS PUB standards.</p>\n<a href='#fnref:16' rev='footnote'>↩</a></li><li id='fn:17'>\n<p>Daniel M. White, The Federal Information Security Management Act of 2002: A Potemkin Village, 79 Fordham L. Rev. 369, 380-81 (2010)</p>\n<a href='#fnref:17' rev='footnote'>↩</a></li><li id='fn:18'>\n<p>Id., citing Arthur Conklin, Why FISMA Falls Short: The Need for Security Metrics, 41 Wireless Internet S. Provider Proc. 1, 1-8 (2008), http:// www.tech.uh.edu/cae-dc/documents/WISP%202007%C20FISMA%C20metrics%C20paper% 20final.pdf; see also Agencies in Peril (internal citations omitted).</p>\n<a href='#fnref:18' rev='footnote'>↩</a></li><li id='fn:19'>\n<p>44 U.S.C.A. 3544(b)(8).</p>\n<a href='#fnref:19' rev='footnote'>↩</a></li><li id='fn:20'>\n<p><em>See generally</em> 12-3 BRPAPERS 1, 12-3 Briefing Papers 1, 11-12.</p>\n<a href='#fnref:20' rev='footnote'>↩</a></li><li id='fn:21'>\n<p>44 U.S.C.A. 3544(a)(2)(C); see also 44 U.S.C.A. 3544(b)(2)(B). NIST Special Publication 800-53A, Rev. 
1, Guide for Assessing the Security Controls in Federal Information Systems and Organizations 3, 1.1 (June 2010) ([O]rganizations have the inherent flexibility to determine the level of effort needed for a particular assessment This determination is made on the basis of what will accomplish the assessment objectives in the most cost-effective manner and with sufficient confidence to support the subsequent determination of the resulting mission or business risk.), available at http://1.usa.gov/aDXfog.</p>\n<a href='#fnref:21' rev='footnote'>↩</a></li><li id='fn:22'>\n<p>5 U.S.C.A 552a.</p>\n<a href='#fnref:22' rev='footnote'>↩</a></li><li id='fn:23'>\n<p><em>Id</em>552a(b), (d), (f)<em>.</em></p>\n<a href='#fnref:23' rev='footnote'>↩</a></li><li id='fn:24'>\n<p>44 U.S.C.A. 3601 <em>et seq</em>.</p>\n<a href='#fnref:24' rev='footnote'>↩</a></li><li id='fn:25'>\n<p><em>Id.</em> <em>See generally,</em> Shahid Khan, <u>"Apps.gov": Assessing Privacy in the Cloud Computing Era</u>, 11 N.C.J.L. & Tech. On. 259, 272 (2010).</p>\n<a href='#fnref:25' rev='footnote'>↩</a></li><li id='fn:26'>\n<p>44 U.S.C.A. 3601 at 208(b)(2)(b)(i), 116 Stat. 2922.</p>\n<a href='#fnref:26' rev='footnote'>↩</a></li><li id='fn:27'>\n<p>44 U.S.C.A. 3501.</p>\n<a href='#fnref:27' rev='footnote'>↩</a></li><li id='fn:28'>\n<p>44 U.S.C.A. 3502(14) (1994), amended by Paperwork Reduction Act of 1995, Pub. L. No. 104-13, 3502(8), 109 Stat. 163, 166.</p>\n<a href='#fnref:28' rev='footnote'>↩</a></li><li id='fn:29'>\n<p><em>Id.</em>at 3502(6).</p>\n<a href='#fnref:29' rev='footnote'>↩</a></li><li id='fn:30'>\n<p><em>See</em> Shahid Khan, <u>"Apps.gov": Assessing Privacy in the Cloud Computing Era</u>, 11 N.C.J.L. & Tech. On. 259, 273 (2010)</p>\n<a href='#fnref:30' rev='footnote'>↩</a></li><li id='fn:31'>\n<p>Shahid Khan, <u>"Apps.gov": Assessing Privacy in the Cloud Computing Era</u>, 11 N.C.J.L. & Tech. On. 
259, 273-74 (2010)</p>\n<a href='#fnref:31' rev='footnote'>↩</a></li><li id='fn:32'>\n<p>5 U.S.C.A. 552a(e) (2000).</p>\n<a href='#fnref:32' rev='footnote'>↩</a></li><li id='fn:33'>\n<p>About Privacy Act Issuances, Government Printing Office, available at http://1.usa.gov/QSzFBE.</p>\n<a href='#fnref:33' rev='footnote'>↩</a></li><li id='fn:34'>\n<p>Julianne M. Sullivan, Will the Privacy Act of 1974 Still Hold Up in 2004? How Advancing Technology Has Created A Need for Change in the "System of Records" Analysis, 39 Cal. W. L. Rev. 395, 399 (2003) (adding The definition provided by the Privacy Act is not based on the ordinary, plain meaning of the words system of records, but is in fact a very specific type of system, with very particular rules. This distinction likely arose out of the need to create some kind of distinction between groups of records that should be accessible and those that should not.)</p>\n<a href='#fnref:34' rev='footnote'>↩</a></li><li id='fn:35'>\n<p>5 U.S.C.A. 552(a)(5) (2000).</p>\n<a href='#fnref:35' rev='footnote'>↩</a></li><li id='fn:36'>\n<p>Henke v. Dept of Commerce, 83 F.3d 1553, 1459-60 (D.C. Cir. 1996).</p>\n<a href='#fnref:36' rev='footnote'>↩</a></li><li id='fn:37'>\n<p><em>See</em> Sullivan, citing Henke, 83 F.3d at 1461 (noting that when records are compiled for investigatory purposes, even a few retrievals might be sufficient to create a system of records).</p>\n<a href='#fnref:37' rev='footnote'>↩</a></li><li id='fn:38'>\n<p>FAR 7.103(w). <em>See</em> generally Appendix III to OMB Circular No. A-130, Office of Management and Budget, available at http://1.usa.gov/RlibPS.</p>\n<a href='#fnref:38' rev='footnote'>↩</a></li><li id='fn:39'>\n<p>FIPS PUB 199, Standards for Security Categorization of Federal Information and Information Systems 1, 2 (Feb. 2004), <a>http://1.usa.gov/4nzS8</a>; NIST Special Publication 800-53, Rev. 
3, Recommended Security Controls for Federal Information Systems and Organizations 6, 2.1 (May 2010), http://1.usa.gov/9DDoih.</p>\n<a href='#fnref:39' rev='footnote'>↩</a></li><li id='fn:40'>\n<p><em>See, e.g.,</em> OMB Memorandum 11-33, FY 2011 Reporting Instructions for the Federal Information Security Management Act and Agency Privacy Management (Sept. 14, 2011) (enclosing DHS Memorandum FISM 11-02 (Aug. 24, 2011)), http://1.usa.gov/oCB4it.</p>\n<a href='#fnref:40' rev='footnote'>↩</a></li><li id='fn:41'>\n<p>More Security, Less Waste: What Makes Sense for Our Federal Cyber Defense: Hearing Before the Subcomm. on Federal Financial Management, Government Information, Federal Services & International Security, 111th Cong. (Oct. 29, 2009) (statement of Sen. McCain), http://1.usa.gov/Ut4XuK.</p>\n<a href='#fnref:41' rev='footnote'>↩</a></li><li id='fn:42'>\n<p>GovWin Says FISMA Fails to Improve Overall Security, March 16, 2006, http://bit.ly/UxOqHX.</p>\n<a href='#fnref:42' rev='footnote'>↩</a></li><li id='fn:43'>\n<p>Daniel M. White, The Federal Information Security Management Act of 2002: A Potemkin Village, 79 Fordham L. Rev. 369, 372 (2010) (internal citations omitted).</p>\n<a href='#fnref:43' rev='footnote'>↩</a></li><li id='fn:44'>\n<p>See Wm. Arthur Conklin, Why FISMA Falls Short: The Need for Security Metrics, 41 Wireless Internet S. Provider Proc. 1, 1-8 (2008), http://bit.ly/TMqEqM; <em>see also</em> Agencies in Peril: Are We Doing Enough to Protect Federal IT and Secure Sensitive Information?: Hearing Before the S. Subcomm. on Fed. Fin. Mgmt., Gov’t, Info., Fed. Servs., and Int’l Sec., 110th Cong. 1 (2008) at 2-6 (statement of Tim Bennett, President of Cyber Security Industry Alliance) (identifying general flaws in FISMA reporting); Angela Gunn, Fed Having Fits over FISMA and Cybersecurity, Betanews (Dec. 12, 2008), http://bit.ly/X16RFd. <em>See</em> <em>generally</em> Daniel M. 
White, The Federal Information Security Management Act of 2002: A Potemkin Village, 79 Fordham L. Rev. 369, 380 (2010).</p>\n<a href='#fnref:44' rev='footnote'>↩</a></li><li id='fn:45'>\n<p><em>Id.</em></p>\n<a href='#fnref:45' rev='footnote'>↩</a></li><li id='fn:46'>\n<p>Vijayan, Jaikumar, Critics question value of federal IT security report card, Computerworld, May 21st, 2008, available at http://bit.ly/QSHKX0 (noting some agencies that are making an effort to comply with the true intent of the 396-page FISMA requirements document are getting poor grades on the annual report card, while others that have treated the process as a mere paperwork exercise are getting good grades.) (Internal quotation marks omitted).</p>\n<a href='#fnref:46' rev='footnote'>↩</a></li><li id='fn:47'>\n<p>White, 79 Fordham L. Rev 369, 382; <em>Id.</em> (commenting First, Congress creates waste by writing FISMA in a way that demands useless reporting, and then it highlights the useless scores in a way that in some cases provides incentives for federal agencies to deliver misleading results.).</p>\n<a href='#fnref:47' rev='footnote'>↩</a></li><li id='fn:48'>\n<p>White, 79 Fordham L. Rev 369, 381-82.</p>\n<a href='#fnref:48' rev='footnote'>↩</a></li><li id='fn:49'>\n<p><em>Id.</em> at 381.</p>\n<a href='#fnref:49' rev='footnote'>↩</a></li><li id='fn:50'>\n<p>Gregg Carlstrom, Net Attacks Triple in 2 Years, Fed. Times (Aug. 3, 2009), http://bit.ly/RluFaf. (a conservative estimate, since many agencies underreport by as much as 50%, and since the statistic excludes the Department of Defense, which receives millions of scans and probes each year.); <em>See also</em> Conklin, 41 Wireless Internet S. Provider Proc. 1 (The recent spate of highly publicized information security failures in Federal agencies highlight the limitations of the current FISMA based approach…. The fact that … some agencies have not had an information security failure[s may be due to] lack of knowledge.). 
<em>See generally id.</em>at 382.</p>\n<a href='#fnref:50' rev='footnote'>↩</a></li><li id='fn:51'>\n<p>Kevin R. Pinkney, Putting Blame Where Blame Is Due: Software Manufacturer and Customer Liability for Security-Related Software Failure, 13 Alb. L.J. Sci. & Tech. 43, 66 (2002).</p>\n<a href='#fnref:51' rev='footnote'>↩</a></li><li id='fn:52'>\n<p>White, 79 Fordham L. Rev 369, 384.</p>\n<a href='#fnref:52' rev='footnote'>↩</a></li><li id='fn:53'>\n<p>White House Office of the Press Secretary, Remarks by the President on Securing Our Nation’s Cyber Infrastructure (May 29, 2009), http://1.usa.gov/gvW7VM (In one of the most serious cyber incidents to date against our military networks, several thousand computers were infected [in 2008] by malicious software–malware.).</p>\n<a href='#fnref:53' rev='footnote'>↩</a></li><li id='fn:54'>\n<p>Source in Iran Sees Plans for President’s Chopper, USA Today, Mar. 2, 2009 (The U.S. Navy is investigating how an unauthorized user in Iran gained online access to blueprints and other information about a helicopter in President Obama’s fleet.)</p>\n<a href='#fnref:54' rev='footnote'>↩</a></li><li id='fn:55'>\n<p>Cybersecurity: Assessing the Nation’s Ability To Address the Growing Cyber Threat: Hearing Before the H. Comm. on Oversight & Government Reform, 112th Cong. (July 7, 2011) (statement of Rep. Issa), http://1.usa.gov/T2MnvZ.</p>\n<a href='#fnref:55' rev='footnote'>↩</a></li><li id='fn:56'>\n<p>Information Sharing in the Era of WikiLeaks: Balancing Security and Collaboration: Hearing Before the S. Comm. on Homeland Security & Government Affairs, 112th Cong. (Mar. 10, 2011) (statement of Sen. Collins), http://1.usa.gov/TMnAuL.</p>\n<a href='#fnref:56' rev='footnote'>↩</a></li><li id='fn:57'>\n<p>Kime, DOD Hit With Lawsuit Over Lost Tricare Data, ArmyTimes, Oct. 13, 2011, http://bit.ly/ro8C2E.</p>\n<a href='#fnref:57' rev='footnote'>↩</a></li><li id='fn:58'>\n<p>S. Rep. No. 111-110, at 3 (Dec. 
17, 2009).</p>\n<a href='#fnref:58' rev='footnote'>↩</a></li><li id='fn:59'>\n<p>Cybersecurity: Assessing Our Vulnerabilities and Developing an Effective Response: Hearing Before the S. Comm. on Commerce, Science & Transportation, 111th Cong. 8 (Mar. 19, 2009) (statement of Dr. James Lewis), http://1.usa.gov/QT2gXm.</p>\n<a href='#fnref:59' rev='footnote'>↩</a></li><li id='fn:60'>\n<p>Chabrow, Navy CIO’s PII Exposed for Sixth Time, Gov’t Info. Sec. News, Jan. 4, 2010, http://bit.ly/QT2nlU.</p>\n<a href='#fnref:60' rev='footnote'>↩</a></li><li id='fn:61'>\n<p>As outlined in the Chief Financial Officer Act.</p>\n<a href='#fnref:61' rev='footnote'>↩</a></li><li id='fn:62'>\n<p>No Computer System Left Behind: A Review of the 2005 Federal Computer Security Scorecards Before the H. Comm. on Government Reform, 109th Cong. 32 (2006) (statement of Gregory C. Wilshusen, Director, Information Security Issues, United States Government Accountability Office).</p>\n<a href='#fnref:62' rev='footnote'>↩</a></li><li id='fn:63'>\n<p>Robert Silvers, Rethinking Fisma and Federal Information Security Policy, 81 N.Y.U. L. Rev. 1844, 1850 (2006).</p>\n<a href='#fnref:63' rev='footnote'>↩</a></li><li id='fn:64'>\n<p><em>Id.,</em> citing House Comm. on Gov’t Reform, 109th Cong., Computer Security Report Card 1 (2006).</p>\n<a href='#fnref:64' rev='footnote'>↩</a></li><li id='fn:65'>\n<p>OMB 2004 FISMA Report. <em>See Generally</em> Silvers, 81 N.Y.U. L. Rev. <em>at</em> 1850.</p>\n<a href='#fnref:65' rev='footnote'>↩</a></li><li id='fn:66'>\n<p>OMB 2011 FISMA Report, Table A, <em>available at</em> http://1.usa.gov/yiyFBb.</p>\n<a href='#fnref:66' rev='footnote'>↩</a></li><li id='fn:67'>\n<p>Fanin v. U.S. Dep’t of Veterans Affairs, 572 F.3d 868, 870-71 (11th Cir. 2009).</p>\n<a href='#fnref:67' rev='footnote'>↩</a></li><li id='fn:68'>\n<p>Agencies in Peril: Are We Doing Enough to Protect Federal IT and Secure Sensitive Information?: Hearing Before the S. Subcomm. on Fed. Fin. 
Mgmt., Gov’t, Info., Fed. Servs., and Int’l Sec., 110th Cong. 1 (2008).</p>\n<a href='#fnref:68' rev='footnote'>↩</a></li><li id='fn:69'>\n<p>OMB 2011 FISMA Report, Figure 8.</p>\n<a href='#fnref:69' rev='footnote'>↩</a></li><li id='fn:70'>\n<p>Cobell v. Norton, 394 F. Supp. 2d 164 (D.D.C. 2005). <em>See generally</em> Silvers, 81 N.Y.U. L. Rev. 1844, 1849-63; White, 79 Fordham L. Rev. 369, 378.</p>\n<a href='#fnref:70' rev='footnote'>↩</a></li><li id='fn:71'>\n<p><em>See</em> Silvers, 81 N.Y.U. L. Rev. 1844, 1853.</p>\n<a href='#fnref:71' rev='footnote'>↩</a></li><li id='fn:72'>\n<p>In addition, contractors and third-party service providers are implicated under 44 U.S.C.A. 3544(a)-(b) (including information systems provided or managed by contractor, or other source), and DHS Memorandum FISM 11-02 as enclosed in OMB Memorandum M-11-33, Reporting Instructions for the Federal Information Security Management Act and Agency Privacy Management (DHS identified… contractors most likely to be subject to FISMA requirements [including] Service providers–e.g… managed services, like subscriptions to software services.).</p>\n<a href='#fnref:72' rev='footnote'>↩</a></li><li id='fn:73'>\n<p><em>Id.</em>, citing H.R. Rep. No. 107-787, pt.1, at 76-88 (2002).</p>\n<a href='#fnref:73' rev='footnote'>↩</a></li><li id='fn:74'>\n<p><em>Id</em>. at 1853.</p>\n<a href='#fnref:74' rev='footnote'>↩</a></li><li id='fn:75'>\n<p>NetIQ, NetIQ FISMA Compliance & Risk Management Solutions 2 (2005), available at http://bit.ly/UyC9TJ. <em>See generally</em> White, 79 Fordham L. Rev. 369, 405.</p>\n<a href='#fnref:75' rev='footnote'>↩</a></li><li id='fn:76'>\n<p>Silvers, 81 N.Y.U. L. Rev. 1844, 1859-60 (citing L.L. Cummings & Donald P. 
Schwab, Performance in Organizations: Determinants and Appraisals 90-101 (1973)).</p>\n<a href='#fnref:76' rev='footnote'>↩</a></li><li id='fn:77'>\n<p><em>Id.</em></p>\n<a href='#fnref:77' rev='footnote'>↩</a></li><li id='fn:78'>\n<p><em>Supra</em> footnote 69 <em>et seq.</em></p>\n<a href='#fnref:78' rev='footnote'>↩</a></li><li id='fn:79'>\n<p>Silvers, 81 N.Y.U. L. Rev. 1844, 1852 (2006) (citing 394 F. Supp. 2d at 261 (quoting trial testimony of W. Hord Tipton, Chief Information Officer, DOI)).</p>\n<a href='#fnref:79' rev='footnote'>↩</a></li><li id='fn:80'>\n<p><em>Id.</em> at 1860-61.</p>\n<a href='#fnref:80' rev='footnote'>↩</a></li><li id='fn:81'>\n<p><em>Id. See</em> Jack M. Beermann, Essay, Administrative Failure and Local Democracy: The Politics of DeShaney, 1990 Duke L.J. 1078, 1105 ([A]dministrative failures may be so low on the political agenda that they will not even be addressed in the electoral process.).</p>\n<a href='#fnref:81' rev='footnote'>↩</a></li><li id='fn:82'>\n<p><em>Id.</em> at 1862. <em>See</em> Beermann, supra note 79, at 1106 ([U]nelected agents are shielded from direct political scrutiny. Thus, given the difficulty of effective oversight, agency actions may not be brought into line with legislatively stated goals.).</p>\n<a href='#fnref:82' rev='footnote'>↩</a></li><li id='fn:83'>\n<p>Silvers, 81 N.Y.U. L. Rev. at 1862.</p>\n<a href='#fnref:83' rev='footnote'>↩</a></li><li id='fn:84'>\n<p><em>See generally</em> Silvers, 81 N.Y.U. L. Rev. 1844 at 1868.</p>\n<a href='#fnref:84' rev='footnote'>↩</a></li><li id='fn:85'>\n<p><em>Id.</em> (citing Randal O’Toole, Reforming the Forest Service 104 (1988) (For top managers, larger budgets mean greater prestige. For middle managers, larger budgets mean more people on their staff, and this generally provides them with higher salaries. 
For lower managers, larger budgets mean greater opportunities for advancement.)).</p>\n<a href='#fnref:85' rev='footnote'>↩</a></li><li id='fn:86'>\n<p><em>Id.</em> at 1871.</p>\n<a href='#fnref:86' rev='footnote'>↩</a></li></ol></div> ]]></description>\n </item><item>\n <title>Deprecate Management</title>\n <link>http://ben.balter.com/2012/12/16/deprecate-management/</link>\n <pubDate>Sun, 16 Dec 2012 00:00:00 +0000</pubDate>\n <dc:creator>Benjamin J. Balter</dc:creator>\n <category>Open Source</category>\n <category>GitHub</category>\n <category>Collaboration</category>\n <category>Management</category>\n <guid isPermaLink=\"false\">/2012/12/16/deprecate-management</guid>\n <description><![CDATA[ <p>There are many aspects to “making things” that <a href='http://ben.balter.com/2012/10/19/we-ve-been-trained-to-make-paper/'>open source just does better</a>. Regardless of whether, at the end of the day, you ship bits or cogs, certain aspects of “office” work are universal: ideation, vetting initiatives, resolving conflicts, and shipping product. Now imagine if you had to do all this not across conference tables, but across geographies and timezones. You’d have a pretty kickass process for sure. Welcome to the world of open source.</p>\n\n<p>Think about it this way: in the traditional office setting, we use management to facilitate this collaborative building process. Management does many things, but at the most basic level, they:</p>\n\n<ul>\n<li>Shuttle information</li>\n\n<li>Coordinate across business units</li>\n\n<li>Align efforts to organization priorities</li>\n\n<li>Make sure people do work</li>\n\n<li>Recruit new people</li>\n</ul>\n\n<p>This makes sense if you look at the history of the role. In an age when conveying information was onerous, the only way for Adam to tell Becky what he was working on (and thus to ensure Becky was not duplicating efforts) was to stop what he was doing, walk down the hall, and interrupt Becky. 
So instead of doing this every day, we hire Charlie to facilitate a standing meeting and shuttle that information back and forth. Makes sense.</p>\n\n<p>But what if, when that problem first arose, Adam could have sent Becky an e-mail or an IM, or posted an update to a shared collaboration space? Do you think they’d need Charlie in the first place? Would management as we see it today have arisen in an age where technology reduces the friction of collaboration to nearly nil?</p>\n\n<p>Take the open source community as a test case, which was afforded just such a unique opportunity. Same problem, same outcome, and (for the most part), no traditional hierarchical structure. How do you overcome the management burden? Transparent, persistent communication — everything from code to decisions happens in the open and is archived for all to see — and pure meritocracy — a bunch of ideas arise and are voted on (through opt-in participation) and the best are seen through to fruition.</p>\n\n<p>But does it <del>blend</del> scale? WordPress, the open source content management system, had nearly 300 individual contributors to its latest release, in just under four months, all working on a single project downloaded more than a million times within days of its release. And there’s no reason this process has to be limited to software. Collaboration is collaboration.</p>\n\n<p>So what aspects of the open source process make this management-free collaboration possible? Ryan Tomayko <a href='http://tomayko.com/writings/adopt-an-open-source-process-constraints'>outlines</a> his experience applying the open source philosophy to an entire (for-profit) venture, noting four key features to the system:</p>\n\n<blockquote>\n<ul>\n<li>\n<p><strong>Electronic</strong>: Discussion, planning, and operations process should use a high fidelity form of electronic communication like email, github.com, or chat with transcripts wherever possible. 
Avoid meatspace discussion and meetings.</p>\n</li>\n\n<li>\n<p><strong>Available</strong>: Work should be visible and expose process. Work should have a URL. It should be possible to move backward from a piece of product or a system failure and understand how it came to be that way. Prefer git, issues, pull requests, mailing lists, and chat with transcripts over URL-less mediums.</p>\n</li>\n\n<li>\n<p><strong>Asynchronous</strong>: Almost no part of the product development process requires that one person interrupt another’s immediate attention or that people be in the same place at the same time, or even that people be in different places at the same time. Even small meetings or short phone calls can wreck flow so consider laying it out in (a thought out) email or sending a pull request instead.</p>\n</li>\n\n<li>\n<p><strong>Lock free</strong>: Avoid synchronization / lock points when designing process. This is <a href='http://en.wikipedia.org/wiki/Distributed_revision_control'>distributed version control</a> writ large. We don’t have a development manager that grants commit bit to repositories before you can do work, or a release manager that approves deploys, or a product manager that approves work on experimental product ideas. Work toward a goal should never be blocked on approval. Push approval/rejection to the review stage or automate it, but surface work early to get feedback.</p>\n</li>\n</ul>\n</blockquote>\n\n<p>Granted, this open-source philosophy doesn’t apply to every workplace, but how much better would the process of “making things” be if we could eliminate traditional pain points of managerial friction entirely — conference calls, status meetings, “sync ups”, and other non-decisional “checkins”? 
Work happens in the open, rather than hidden away in one-on-one e-mails or behind closed doors, and decisions are made by those who show up to do the work.</p>\n\n<p>Straddling the line between arguably the world’s most bureaucratic, hierarchical organization (the federal government) and its definitional polar opposite (open source) provides a unique perspective. There are so many aspects to the work day that we do just because it’s the way things have been done since the dawn of the industrial revolution, and it puzzles me why nobody’s stopped to think critically about how these processes could be reimagined in an age of technology. I need to get this (physical) form approved by Joan? Okay, you just took two people away from doing what they’re paid to be doing. I have to e-mail someone for the latest version or figure out where we’re at on this project? Again, just moved someone from high-level to low-level work.</p>\n\n<p>Granted, not every workplace is apt for such radical egalitarianism, but the buttoned-up (offline) world of “serious” business could learn a thing or two from open source’s collaborative experiment. In many respects, organizational friction is no longer a sunk cost, and thus, arguably, so too is management.</p> ]]></description>\n </item><item>\n <title>Why WordPress's next version should just give it a REST already</title>\n <link>http://ben.balter.com/2012/12/15/why-wordpress-next-version-should-just-give-it-a-rest-already/</link>\n <pubDate>Sat, 15 Dec 2012 00:00:00 +0000</pubDate>\n <dc:creator>Benjamin J. Balter</dc:creator>\n <category>WordPress</category>\n <category>REST</category>\n <category>API</category>\n <category>Technology</category>\n <guid isPermaLink=\"false\">/2012/12/15/why-wordpress-next-version-should-just-give-it-a-rest-already</guid>\n <description><![CDATA[ <p>The internet has a particular way of solving difficult technical challenges. 
We try a bunch of diverse approaches out, keep only the most elegant, and quickly forget the rest ever happened. That’s why the Web is the Internet’s preeminent service (as opposed to, say, <a href='http://en.wikipedia.org/wiki/Gopher_%28protocol%29'>Gopher</a>), clicking the logo in the top left corner of almost any site goes to that site’s homepage, and typing a URL in your browser retrieves that particular page. These aren’t just design conventions in the sense that a lot of people like them, but rather represent the purposeful result of trial and error.</p>\n\n<p>Over the past few years, as sites become more mature and even more inter-connected, the internet has been coalescing around one such pattern (known as <a href='http://en.wikipedia.org/wiki/Representational_state_transfer'>REST</a>). The idea is simple: a URL should uniquely identify the underlying data it represents. If I have a URL, I shouldn’t need anything else to view or otherwise manipulate the information behind it.</p>\n\n<p>WordPress, for the most part, does this well. Each post is given a unique permalink (e.g., <code>2012-12-15-why-wordpress...</code>) that always points to that post. The problem, however, is that in WordPress’s sense it points to the <em>display</em> of that content, not the content itself. When editing, for example, that same content may be represented as <code>/wp-admin/post.php?p=1234</code>, clearly a different URL, and if you’d like to programmatically access the underlying data (say to build a mobile app, or some sort of external widget), you’re pretty much SOL in terms of WordPress’s core vision.</p>\n\n<p>Why does such a nuance matter? Take a look at the direction the net’s heading. We’re separating content (say, the post itself) from the presentation layer that holds it hostage (say, the theme’s template), so that we can use it in many, many different ways without starting from scratch. 
This goes on behind the scenes in many ways you may not even notice, and that’s the point. By enabling programmatic access to the underlying data, that same post can be read via a mobile app, a feed reader, posted to a social network, or even embedded within another site altogether.</p>\n\n<p>Websites are quickly becoming the curators of information, not simply the presenters of it. It’s a return to content management in its purest form. It’s exposing content as a service, and it’s coming whether we want it or not.</p>\n\n<p>WordPress came about as many of these now-standard design conventions were still emerging, and understandably, it doesn’t exactly embrace them head on. Yet next generation content management systems — not weighed down by history — have an advantage here, and as folks look to build the next generation of websites, they’re obviously going to be looking to where we’re going, not where we’ve been.</p>\n\n<p>If WordPress wants to stay relevant as a content management platform, the future isn’t traditional post-and-forget blogging, but rather a concerted effort to once again make content king. We obviously can’t flip a switch and get there overnight, but a crawl, walk, run approach over the next version or so can better align the veritable CMS with the reality of what’s in our collective pipeline:</p>\n<!-- more -->\n<h3 id='crawl'>Crawl</h3>\n\n<p>At the very least, let’s expose all WordPress content in a machine-readable format. This once and for all breaks the content-presentation link. We do this with RSS somewhat, but it’s time to put non-HTML formats on equal footing with HTML in core’s eyes.</p>\n\n<ol>\n<li>\n<p>Create a new format (“feed” in WordPress parlance) called JSON, and add the necessary rewrite rules such that I can simply add <code>.json</code> to any post and retrieve the underlying information in a machine-readable format. 
This should contain not only the content of the post and information that would normally be accessible via HTML, but all the fields of the posts table (e.g., date published, date updated, etc.), all the post’s metadata (custom, post-specific information), and all the associated term objects (tags, categories, etc.). Sure, we’ll need to add a filter or two in there to ensure information folks want private stays private, but from a technical standpoint, we’re talking a handful of lines of code here.</p>\n</li>\n\n<li>\n<p>Extend that format to indexes (archives in WordPress terms). Again, just as above, every list of posts (by date, by associated term, search results) should have the capability to expose the list in that same machine-readable format. This allows for the programmatic discovery of information. A little bit harder than #1, but again, nothing crazy here. Pretty basic stuff.</p>\n</li>\n</ol>\n\n<h3 id='walk'>Walk</h3>\n\n<p>Access to content is half the equation. Allow programmatic management of WordPress content as well. Conceptually, this is nothing radical. WordPress allows remote management of content through the <a href='http://en.wikipedia.org/wiki/XML-RPC'>XML-RPC</a> protocol, a blog-specific format that was designed some 15 years ago. We’re just talking about an upgrade.</p>\n\n<ol>\n<li>\n<p>Use the existing <code>admin-ajax</code> infrastructure to consistently expose administrative functions in a programmatic way. For example, POSTing to <code>admin-ajax.php?action=create</code> should allow me to create a new post, just as <code>admin-ajax.php?action=update&p=123</code> or <code>?action=delete&p=1234</code> should do the same. 
Again, the basic plumbing’s already there; it’s just a matter of abstracting it out and aligning with modern conventions.</p>\n</li>\n\n<li>\n<p>Pick a few high-priority pieces of backend functionality to prototype, such as listing posts or editing an existing post, and rather than reloading the entire administrative interface every time I click something, dogfood the services exposed in #1 to update the content dynamically. Put another way, turn the WordPress backend into a full-fledged client-side content administration application, rather than merely part of a blog. Again, nothing radical here. Gmail does this with mail, Twitter does this with Tweets. It’s time for WordPress to start doing this with posts.</p>\n</li>\n</ol>\n\n<h3 id='run'>Run</h3>\n\n<p>We may not get there tomorrow, but I know that with a bit of nuance, WordPress can align itself as the platform of the future and tackle the next generation of web-based applications in the “WordPress way”. It’s simply a matter of positioning.</p>\n\n<ol>\n<li>\n<p>Transparently map the already-exposed permalink endpoints (e.g., <code>2012/12/15/post.json</code>) to their backend counterparts. This may require a bit of rewriting of the WordPress routing system (to understand <a href='http://en.wikipedia.org/wiki/Hypertext_Transfer_Protocol#Request_methods'>HTTP verbs</a> other than simply GET). At this point, WordPress would expose a fully RESTful API for any content it knows about, but could do so with the traditional WordPress finesse.</p>\n</li>\n\n<li>\n<p>Add <a href='http://backbonejs.org/'>Backbone</a> to the default theme (it’s already used on the backend), and begin to dogfood content on the front end as well as the backend so that clicking a post or page simply retrieves the content, rather than reloading the entire website. There’s an opportunity to really rethink templating here. Perhaps <code>wp_get_ajax_template</code> or something converts a WordPress template to an underscore template. 
Perhaps WordPress compiles everything into JST for me.</p>\n</li>\n</ol>\n\n<p>As community members sit down to sketch out what the next version of WordPress looks like, I sincerely hope they can at least think about implementing some of the front-end functionality early on, and maybe even make a prototypical wp-admin 2.0 somewhat of a priority.</p>\n\n<p>Technology has this tricky way of bringing about organizational change. Making something so dumb-simple really is an empowering force. WordPress did it once as it first set out to democratize publishing, and it’s time to do it again for the next generation of non-blogging websites and applications.</p>\n\n<p><strong>Update (12/20):</strong> <em>Not quite REST, but as <a href='https://twitter.com/scribu'>@scribu</a> points out in the comments below, <a href='https://core.trac.wordpress.org/ticket/14618'>#14618</a> proposed an RPC-like JSON API some two years ago. Looks like the ticket ended up in somewhat of a holy war over standards (XML v. JSON anyone?), but the arguments in favor still stand nonetheless.</em></p> ]]></description>\n </item><item>\n <title>We've been trained to make paper</title>\n <link>http://ben.balter.com/2012/10/19/we-ve-been-trained-to-make-paper/</link>\n <pubDate>Fri, 19 Oct 2012 00:00:00 +0000</pubDate>\n <dc:creator>Benjamin J. Balter</dc:creator>\n <category>collaboration</category>\n <category>Word</category>\n <category>workflow</category>\n <category>git</category>\n <category>GitHub</category>\n <category>markdown</category>\n <category>Technology</category>\n <guid isPermaLink=\"false\">/2012/10/19/we-ve-been-trained-to-make-paper</guid>\n <description><![CDATA[ <p>We’ve been trained wrong. We’ve been trained that content creation starts by firing up a desktop word processor — a piece of software, mind you, that still does its best to generate a digital representation of a physical piece of paper — margins, page breaks, and all. 
Yet this quintessential workplace training simply fails to remain relevant in a world where we carry a computer in our pockets at all times. Our training now tells us to create content for the least-likely way it’s consumed: on paper. We’re stuck in an anachronistic workflow.</p>\n\n<p>It’s not uncommon, for example, for a team to need to write or edit a document together. Take the typical collaborative publishing process, which generally goes something like this:</p>\n\n<ol>\n<li>Draft content in Microsoft Word</li>\n\n<li>Save to shared folder or e-mail around for comments and changes</li>\n\n<li>Manually (and opaquely) reconcile changes one-by-one</li>\n\n<li>Repeat steps 2-3 until satisfied with the document</li>\n\n<li><strong>Convert to web-friendly format</strong></li>\n\n<li>Publish</li>\n</ol>\n\n<p>See what we did there? We’re still writing content for print, and only once we’re completely done do we begin to prepare it for the web. That’s broken. That’s like building an entire car, and then at the last minute, deciding it should actually, in fact, be a plane. If the internet is the primary medium by which content is consumed, shouldn’t that be the primary medium for which content is written?</p>\n\n<h3 id='using_the_wrong_tools'>Using the wrong tools</h3>\n\n<p>Microsoft Word was designed with one purpose in mind: to make paper. Think about it. Its essential elements arose in the early 80’s. There are print-centric buttons like left and right align front and center, but new-fangled internety things like hyperlinks or rich media are buried deep inside these labyrinthine sub-menus. Sure, it’s industry standard, but it’s an industry-standard workflow that arose before the age of the web (and hasn’t really changed since).</p>\n\n<p>Yet the majority of the documents we create today rarely, if ever, embody physical space. 
They don’t care about the things Microsoft Word cares about — margin width, page breaks, or other properties that assume four sharp corners — and more importantly, they don’t handle mobile responsiveness, machine-readability, or other web-specific features.</p>\n\n<h3 id='merely_a_snapshot'>Merely a snapshot</h3>\n\n<p>And then there’s the problem of collaborating. I can’t count the number of times I’ve been e-mailed a document entitled <code>foo-document_2012_10_15_clean_fixed_revised_final2</code> or told that it’s “on the share drive” or asked “are you out yet?”. Without expensive software, that document’s just a snapshot in time. There’s no context. <em>What updates does this version have that weren’t in the last? Wait is this even the most recent version? Who made the last three revisions? What happened with that change I submitted - did you accept it? Why not? Can we discuss? Can two people edit it at the same time? Not to mention — I have a crazy idea — can I go off and experiment in a parallel version?</em></p>\n\n<p>Geeks solved this problem a long time ago. It’s called version control. We built it in the 70’s. We start with content, you edit it, I edit it, and we get better content. It’s really that simple, and better yet, it’s free. It handles curating the master copy, keeps track of each and every change (down to the character mind you), and even provides a transparent forum to discuss those changes in the context in which they occur. <a href='https://github.com/benbalter/benbalter.github.com/commits/master/posts/_posts/2012-10-19-we-ve-been-trained-to-make-paper.md'>Take a look for yourself</a>.</p>\n\n<h3 id='jailbreaking_content'>Jailbreaking content</h3>\n\n<p>So why doesn’t everyone use this “version control”? Because we’re trained wrong. We’ve got to break free of these proprietary, print-only formats. We’ve got to stop shuttling changes back-and-forth via e-mail or with obscure file names. 
We’ve got to unprogram ourselves from an age of print.</p>\n\n<p>And here’s why: <em>.doc files are like tiny micro-jails for our content.</em> Proprietary document formats tend to commingle the text we provide with the commands the software needs to recreate it, and store all this in a complicated and inaccessible binary format. That’s what makes it proprietary. We put text in — one of the most basic things computers understand — and we get this big mess back that can only be opened by that software. Imagine if the most common way to get water was to buy a can of Coke and run it through a Brita filter. It doesn’t need to be so complicated.</p>\n\n<h3 id='break_the_habit'>Break the habit</h3>\n\n<p>Let’s just concentrate on what matters: the content. When you separate design from content, things get a lot cleaner and a lot easier to work with. From now on, instead of clicking the little blue “W” out of habit, ask “does this really need to be a piece of paper?” If not, all of a sudden you can now use the best collaboration tools that mankind has made to date, rather than publishing tools that were made for a bygone generation.</p>\n\n<p>And it’s not that hard. You can just click “edit” below (as an example), or, for the next project that comes across your plate, give git a try:</p>\n\n<ol>\n<li><a href='https://gist.github.com/3914310'>Learn</a> Markdown - it takes 30 seconds. 
Honestly.</li>\n\n<li><a href='https://github.com/signup/free'>Sign up</a> for a GitHub account - it’s free!</li>\n\n<li>Install <a href='http://mac.github.com/'>GitHub for Mac</a> (or <a href='http://windows.github.com/'>GitHub for Windows</a>) and <a href='http://mouapp.com/'>Mou</a></li>\n\n<li>Create a repository and go to work</li>\n</ol>\n\n<p>Granted, some of the tools can be a bit rough around the edges at times, but they are getting better, and like lots of other open-source technologies before them, as we move from paper-first to web-only distribution, the time is ripe for a more evolved, text-centric, distributed workflow to become mainstream. <em>Stop making paper, start collaborating.</em></p> ]]></description>\n </item><item>\n <title>Open Source is not a verb</title>\n <link>http://ben.balter.com/2012/10/15/open-source-is-not-a-verb/</link>\n <pubDate>Mon, 15 Oct 2012 00:00:00 +0000</pubDate>\n <dc:creator>Benjamin J. Balter</dc:creator>\n <category>open source</category>\n <category>collaboration</category>\n <category>proprietary</category>\n <category>community building</category>\n <category>Technology</category>\n <guid isPermaLink=\"false\">/2012/10/15/open-source-is-not-a-verb</guid>\n <description><![CDATA[ <p>I’m always intrigued by developers who use the term “open source” as a verb. As if a switch could magically be thrown, and via a quick mouse click in the <a href='http://www.youtube.com/watch?v=V8rZWw9HE7o'>Danger Zone</a>, a proprietary or purpose-built project quickly morphs into one that’s “open source”.</p>\n\n<p>Open source is not simply about publishing code. That’d be like saying democracy’s simply about the ability to vote. Sure, you can vote, but if your vote doesn’t matter because the act is solely symbolic, it’s not really democracy. It’s just a ruse. Like publishing code, voting is necessary but not sufficient.</p>\n\n<p>Open source, at its core, is actually not about code, but about connecting people around a shared vision. 
It’s about community building. It’s about collaboration. It’s about getting a bunch of enthusiastic, like-minded folks in a metaphorical room together, and giving them the resources they need to solve a shared problem and create something of benefit to others, something that none of them would have been able to do alone. It’s about building and sharing, not about publishing.</p>\n\n<p>Put another way, open source is not an alternative workflow or development method. It’s not as if you can choose between waterfall, agile, and open source means of producing software in a workplace. Instead, it’s an overriding philosophy that guides a project. Like forward thinking, simple, interoperable, system oriented, or open standards. It’s how you approach a problem from the start, not what you do after you’ve already solved it.</p>\n\n<p>To say <em>“hey, we’ve got something decent here, let’s take this closed-source project and just hit publish”</em> misses the mark. Your motivation can’t be to seek free labor, as in <em>“hey, if developers want to give us their time, great, let’s put this out there and see what happens; we have nothing to lose”</em>, or about sporadically seeking to garner good will from a niche community of dedicated fans. Trust me, an open source developer can smell astroturf a mile away, and that’s exactly how far they’ll stay.</p>\n\n<p><strong>So what makes an open source project truly open source and not simply “published”?</strong></p>\n\n<ul>\n<li><strong>Shared Vision</strong> - Open source developers want to get behind a cause. Think of it as analogous to volunteering for a political campaign. They want to know what the project stands for, and where it is going. If they contribute, what will their code be used for in six months or a year?</li>\n\n<li><strong>Clear Goals</strong> - What’s the goal of the project? What’s the roadmap look like? Do you trust the community enough to share it? 
Can they shape that roadmap or is it set in stone?</li>\n\n<li><strong>Active Development</strong> - When’s the last public commit? Are you committing privately, bundling together a release and then blessing the community with your efforts, or is development occurring in the open?</li>\n\n<li><strong>Us/Them Mentality</strong> - Is there a class system between paid/unpaid contributors? Are outside contributions handled on equal footing? Are any outside developers delegated authority or given commit access?</li>\n\n<li><strong>Mechanics</strong> - Is it in version control or just a static download? Is the bug tracker public? Can I comment and submit? What about documentation? Is it maintained in a wiki?</li>\n\n<li><strong>Communication</strong> - Can developers communicate directly or must they go through the parent organization? (e.g., announcement versus conversation models)</li>\n\n<li><strong>Purpose-built Code</strong> - Is the code written for open source? Is it sufficiently documented? Is it modular? Is it specific to the initial use case or abstracted out to the underlying logic?</li>\n</ul>\n\n<p>All of the above are underlying principles that drive development from day one, and yet not incompatible with a philosophy that dictates code remains under lock and key until a minimum viable product (MVP) has been reached. They do remain incompatible, however, with a philosophy that says that business as usual can be easily switched mid-stream to a successful open source project by simply not keeping the code secret.</p>\n\n<p>In the end, it’s about <a href='http://ben.balter.com/open-source-for-government/#open_source_community_building'>developing a community</a>, not about developing software. You’re selling an experience — whether it’s scratching a developer’s personal itch or giving them the opportunity to change the world. 
Next time you seek to build something useful, unless it’s the recipe for your secret sauce or something so specific as to render it worthless outside the organization’s walls, consider <a href='http://ben.balter.com/2012/06/26/why-you-should-always-write-software-as-open-source/'>making it open source from the start</a>, and instead seeking to grow a vibrant community around a cause, rather than simply coding a piece of software that happens to not be secret.</p> ]]></description>\n </item><item>\n <title>Open Source for Government</title>\n <link>http://ben.balter.com/2012/10/09/open-source-for-government/</link>\n <pubDate>Tue, 09 Oct 2012 00:00:00 +0000</pubDate>\n <dc:creator>Benjamin J. Balter</dc:creator>\n <category>open source</category>\n <category>collaboration</category>\n <category>government</category>\n <category>gov20</category>\n <category>gov 2.0</category>\n <category>community building</category>\n <category>how to</category>\n <category>Technology</category>\n <guid isPermaLink=\"false\">/2012/10/09/open-source-for-government</guid>\n <description><![CDATA[ <p>I encourage you to read through <a href='http://ben.balter.com/open-source-for-government/'>Open Source for Government</a>, a collaborative resource for government employees looking to participate in the open source community.</p>\n\n<p>Also please feel free to <a href='https://github.com/benbalter/open-source-for-government'>fork and contribute</a> (no technical knowledge necessary).</p> ]]></description>\n </item><item>\n <title>Welcome to the Post-CMS World</title>\n <link>http://ben.balter.com/2012/10/01/welcome-to-the-post-cms-world/</link>\n <pubDate>Mon, 01 Oct 2012 00:00:00 +0000</pubDate>\n <dc:creator>Benjamin J. 
Balter</dc:creator>\n <category>WordPress</category>\n <category>Jekyll</category>\n <category>GitHub</category>\n <category>benchmarking</category>\n <category>benchmarks</category>\n <category>Technology</category>\n <guid isPermaLink=\"false\">/2012/10/01/welcome-to-the-post-cms-world</guid>\n <description><![CDATA[ <p>You may notice things are a bit snappier around here these days, having <a href='https://github.com/benbalter/wordpress-to-jekyll-exporter'>recently converted</a> the site from WordPress to <a href='https://github.com/mojombo/jekyll'>Jekyll</a>.<sup id='fnref:1'><a href='#fn:1' rel='footnote'>1</a></sup></p>\n\n<p>Jekyll is a blog-aware static site generator — heavily integrated with the social code sharing service GitHub — the move to which was primarily motivated by a desire to embrace the brave new, <a href='http://developmentseed.org/blog/2012/07/27/build-cms-free-websites/'>post-CMS world</a> we now find ourselves in. While WordPress is great, <a href='http://cl.ly/image/1M420a152e1z'>130 outages over the past six months (totaling more than a day’s worth of downtime)</a> left a bit to be desired in terms of hosting.</p>\n\n<p>Although powered by the open-source CMS WordPress, the old site (shared hosting provided by Bluehost) would, for performance’s sake, actually just serve flat HTML and JavaScript files from disk (generated on a regular basis by an industry-standard plugin known as <a href='http://wordpress.org/extend/plugins/w3-total-cache/'>W3 Total Cache</a>), but fired up WordPress on every request (on top of the already sluggish Apache).</p>\n\n<p>Don’t get me wrong. WordPress can be <a href='http://wordpress.org/extend/plugins/batcache/'>configured to fly</a> given the right setup, and that’s exactly what I set out to do. I got the best of the best. 
I spun up a shiny new AWS box, got Nginx with microcache up and running, APC for opcode, page, and object cache, and even put everything behind Varnish.</p>\n\n<p>But as much as it pains the developer in me, just like fixies, PBR, and JavaScript, static sites are back in style. Reduce the complexity, push it to the edge, and let the visitor’s browser call APIs directly to generate any dynamic content you may need. Same functionality, same experience, no headache.</p>\n\n<p>The pitch is straightforward. It leads to simple, flexible, and reliable websites that allow for a renewed focus on what actually matters: the content. Dave Cole over at <a href='http://developmentseed.org/'>Development Seed</a> (also powered by Jekyll) <a href='http://developmentseed.org/blog/2012/07/27/build-cms-free-websites/'>put it best</a>:</p>\n\n<blockquote>\n<p>In the past, building websites with features like consistent templates and lists of aggregated content meant setting up complex content management systems. These CMSs consisted of templating logic, application code, and content databases so they could assemble webpages each time they were requested by site visitors. They were complicated systems that depend on many separate applications working together, like a web server to route page requests to a PHP application that uses pre-defined page layout templates to format content that’s stored in a MySQL database. Serving a page request required at least three separate applications all working together — any one failing would bring down the system…</p>\n</blockquote>\n\n<blockquote>\n<p>From open source frameworks like Drupal, Wordpress, and Expression Engine to multi-million dollar proprietary applications that the government and big corporations procure from companies that also build tanks and battle ships, these systems produce the same exact output: HTML, CSS, and JavaScript files that web browsers know how to format into the webpages we see. 
Additional features like RSS or JSON API feeds are just new templates for the same content, and backend workflow modules like those for embedded media and handling email notifications are really separate systems that introduce complexity when integrated with the publishing system.</p>\n</blockquote>\n\n<p>And then there’s cost. Putting aside the value of time for a moment, shared hosting’s going to run you in the ballpark of $7 a month; AWS starts at $14, plus the cost of bandwidth and storage; and Jekyll, if hosted by GitHub? Free.<sup id='fnref:2'><a href='#fn:2' rel='footnote'>2</a></sup></p>\n\n<p>I stood up the three options side-by-side, and ran them through the rigors of a performance testing tool humorously called <a href='http://www.joedog.org/siege-home/'>Siege</a>, the results of which can be found below.</p>\n\n<p>I’m still unpacking some of the boxes of bytes, so if you notice something that doesn’t seem right, feel free to <a href='https://github.com/benbalter/benbalter.github.com/issues'>open an issue</a>, or better yet, if you like what you see, feel free to <a href='https://github.com/benbalter/benbalter.github.com'>fork and contribute</a>. Embracing somewhat of a crawl, walk, run, or fail-fast philosophy, next up is <a href='https://github.com/benbalter/benbalter.github.com/blob/js/_plugins/generate-json.rb'>outputting the pages as JSON</a> and relying on Backbone to do the heavy lifting.</p>\n\n<p>Are these the <a href='http://presidential-innovation-fellows.github.com/mygov/'>first shots</a> <a href='http://presidential-innovation-fellows.github.com/rfpez-blog/'>of a static-site</a> <a href='http://presidential-innovation-fellows.github.com/bluebutton/'>revolution</a>? Time will tell.</p>\n\n<p>The CMS is dead. 
Long live the CMS.</p>\n<!-- more --><hr />\n<h2 id='the_results'>The Results</h2>\n\n<p><strong>WARNING: Geek Content!</strong></p>\n\n<h3 id='homepage'>Homepage</h3>\n\n<p>Command: <code>siege -c 20 -t 30S -b ben.balter.com</code></p>\n\n<p>The first test was to benchmark the homepage, the most heavily trafficked page on the site. Given 30 seconds of continuous traffic from 20 concurrent users, Bluehost was able to serve a meager 40 users. AWS managed an impressive 1,954 users during that same time period (a roughly 50x performance improvement), and did so twice as fast. Enter Jekyll with more than 2,600 users (a 65x increase), responding on average to each in less than a quarter of a second.</p>\n\n<h4 id='shared_hosting_bluehost'>Shared Hosting (Bluehost)</h4>\n\n<pre><code>Transactions:\t\t 40 hits\nAvailability:\t\t 100.00 %\nElapsed time:\t\t 29.54 secs\nData transferred:\t 0.68 MB\nResponse time:\t\t 0.57 secs\nTransaction rate:\t 1.35 trans/sec\nThroughput:\t\t 0.02 MB/sec\nConcurrency:\t\t 0.78\nSuccessful transactions: 40\nFailed transactions:\t 0\nLongest transaction:\t 0.71\nShortest transaction:\t 0.47</code></pre>\n\n<h4 id='varnish__microcache__page_cache__object_cache_aws'>Varnish + Microcache + Page Cache + Object Cache (AWS)</h4>\n\n<pre><code>Transactions:\t\t 1954 hits\nAvailability:\t\t 100.00 %\nElapsed time:\t\t 29.39 secs\nData transferred:\t 13.63 MB\nResponse time:\t\t 0.30 secs\nTransaction rate:\t 66.49 trans/sec\nThroughput:\t\t 0.46 MB/sec\nConcurrency:\t\t 19.80\nSuccessful transactions: 1954\nFailed transactions:\t 0\nLongest transaction:\t 0.92\nShortest transaction:\t 0.06</code></pre>\n\n<h4 id='github_pages'>GitHub Pages</h4>\n\n<pre><code>Transactions:\t\t 2629 hits\nAvailability:\t\t 100.00 %\nElapsed time:\t\t 29.42 secs\nData transferred:\t 2.71 MB\nResponse time:\t\t 0.22 secs\nTransaction rate:\t 89.36 trans/sec\nThroughput:\t\t 0.09 MB/sec\nConcurrency:\t\t 19.86\nSuccessful transactions: 2629\nFailed transactions:\t 0\nLongest 
transaction:\t 1.38\nShortest transaction:\t 0.06</code></pre>\n\n<h3 id='404s'>404s</h3>\n\n<p>Command: <code>siege -c 20 -t 30S -b ben.balter.com/aaaaaaa/</code></p>\n\n<p>The true challenge comes not from serving a static front page (which is presumably cached by WordPress after the first request), but from what happens when it has to reach into the database to retrieve content, for example, when processing a page that doesn’t exist.<sup id='fnref:3'><a href='#fn:3' rel='footnote'>3</a></sup> Bluehost squeezed out a single response each second, AWS just over 50, and Jekyll didn’t flinch at 80.</p>\n\n<h4 id='shared_hosting_bluehost'>Shared Hosting (Bluehost)</h4>\n\n<pre><code>Transactions:\t\t 30 hits\nAvailability:\t\t 21.43 %\nElapsed time:\t\t 29.58 secs\nData transferred:\t 0.19 MB\nResponse time:\t\t 14.93 secs\nTransaction rate:\t 1.01 trans/sec\nThroughput:\t\t 0.01 MB/sec\nConcurrency:\t\t 15.14\nSuccessful transactions: 0\nFailed transactions:\t 110\nLongest transaction:\t 22.88\nShortest transaction:\t 0.00</code></pre>\n\n<h4 id='varnish__microcache__page_cache__object_cache_aws'>Varnish + Microcache + Page Cache + Object Cache (AWS)</h4>\n\n<pre><code>Transactions:\t\t 1567 hits\nAvailability:\t\t 100.00 %\nElapsed time:\t\t 29.13 secs\nData transferred:\t 14.71 MB\nResponse time:\t\t 0.37 secs\nTransaction rate:\t 53.79 trans/sec\nThroughput:\t\t 0.50 MB/sec\nConcurrency:\t\t 19.83\nSuccessful transactions: 0\nFailed transactions:\t 0\nLongest transaction:\t 1.13\nShortest transaction:\t 0.00</code></pre>\n\n<h4 id='github_pages'>GitHub Pages</h4>\n\n<pre><code>Transactions:\t\t 2373 hits\nAvailability:\t\t 100.00 %\nElapsed time:\t\t 29.82 secs\nData transferred:\t 10.48 MB\nResponse time:\t\t 0.25 secs\nTransaction rate:\t 79.58 trans/sec\nThroughput:\t\t 0.35 MB/sec\nConcurrency:\t\t 19.92\nSuccessful transactions: 0\nFailed transactions:\t 0\nLongest transaction:\t 1.42\nShortest transaction:\t 0.00</code></pre>\n\n<h3 
id='uptime'>Uptime</h3>\n\n<p>The other concern was uptime. With the AWS route you may get the performance, but with all that complexity, it’s increasingly likely that something will go wrong, and be harder to diagnose and resolve, and unlike shared or managed hosting, if your site goes down at 3:00 am, the only person to call is yourself. (No thanks.)</p>\n\n<p>With Jekyll, because the files are simply sitting on the server, absent a catastrophic failure, when things go wrong, it simply keeps serving the old content.<sup id='fnref:4'><a href='#fn:4' rel='footnote'>4</a></sup></p>\n\n<h2 id='conclusion'>Conclusion</h2>\n\n<p>It’s cheaper, it’s faster, it’s simpler, it’s worry free, and in my opinion, it’s the future. Welcome to the post-CMS world.</p>\n<div class='footnotes'><hr /><ol><li id='fn:1'>\n<p>Not to be confused with <a href='http://www.youtube.com/watch?v=Q7H_L5cYkg8'>The Jackal</a>.</p>\n<a href='#fnref:1' rev='footnote'>↩</a></li><li id='fn:2'>\n<p>That’s free as in speech <strong>and</strong> free as in beer.</p>\n<a href='#fnref:2' rev='footnote'>↩</a></li><li id='fn:3'>\n<p>Requesting a page that doesn’t exist will require WordPress to run multiple database queries to attempt to find the page, a request that would most likely not be cached in the event that the 404 was sent in error.</p>\n<a href='#fnref:3' rev='footnote'>↩</a></li><li id='fn:4'>\n<p>GitHub’s build queue has been backing up every once in a while as of late, but if a change isn’t instantaneous, I’m okay with that.</p>\n<a href='#fnref:4' rev='footnote'>↩</a></li></ol></div> ]]></description>\n </item><item>\n <title>Government's Release of Federally Funded Source Code: Public Domain or Open Source? Yes.</title>\n <link>http://ben.balter.com/2012/07/26/government-release-of-source-code-public-domain-or-open-source/</link>\n <pubDate>Thu, 26 Jul 2012 00:00:00 +0000</pubDate>\n <dc:creator>Benjamin J. 
Balter</dc:creator>\n <category>.govs</category>\n <category>code</category>\n <category>contracting</category>\n <category>copyright</category>\n <category>enterprise</category>\n <category>federal</category>\n <category>gov 2.0</category>\n <category>government</category>\n <category>gpl</category>\n <category>IT</category>\n <category>licensing</category>\n <category>open government</category>\n <category>open source</category>\n <category>procurement</category>\n <category>Law</category>\n <category>Technology</category>\n <guid isPermaLink=\"false\">/2012/07/26/government-release-of-source-code-public-domain-or-open-source</guid>\n <description><![CDATA[ <p>A petition was recently posted on <a href='https://petitions.whitehouse.gov/'>We The People</a> demanding <a href='https://petitions.whitehouse.gov/petition/maximize-public-benefit-federal-technology-sharing-government-developed-software-under-open-source/6n5ZBBwf?utm_source=wh.gov&utm_medium=shorturl&utm_campaign=shorturl'>that federally funded software be released under an open source license</a>. Makes sense. The public should have access to what is technically their property.</p>\n\n<p>However, <a href='http://www.techdirt.com/articles/20120723/12181319800/should-software-created-federal-govt-be-open-source-licensed-public-domain.shtml'>TechDirt posed the question</a> of whether it should be released under an open-source license or public domain, and I’m afraid they really missed the point.</p>\n\n<p>There’s no doubt in my mind that the creator of the petition was simply asking the question “I can haz source code?” Plain and simple. Put it in context: 99% of the time when an organization (or an individual) releases software to the public, they do so under the terms of an open source license. It tells users what they can and can’t do, and tells contributors under what terms they can contribute. It sets the ground rules. It’s a contract with the public. 
It’s a prenup for code.</p>\n\n<p>So what’s the issue? Although I generally dread the phrase, in this case, government is objectively different. Under 17 U.S.C. § 105, US Government Works are not subject to domestic copyright protection. It’s not technically public domain, but it’s close enough. <sup id='fnref:1'><a href='#fn:1' rel='footnote'>1</a></sup> Any US citizen can use the code any way they wish. There’s simply no copyright, thus no need to license. <sup id='fnref:2'><a href='#fn:2' rel='footnote'>2</a></sup> And this entire debate is a moot point if the software is a derivative work of code under a viral license like the GPL, the most common open source license. <sup id='fnref:3'><a href='#fn:3' rel='footnote'>3</a></sup></p>\n\n<p>That, of course, only applies to code created by a US government employee, an increasingly rare occurrence. <sup id='fnref:4'><a href='#fn:4' rel='footnote'>4</a></sup> Absent permission from the contracting officer, the US government retains unlimited rights for all work created under contract (including the right to redistribute). <sup id='fnref:5'><a href='#fn:5' rel='footnote'>5</a></sup> And again, it’s a moot point if the work is a GPL derivative (and thus must be given to the Government under the GPL).</p>\n\n<p>Yet all this is very academic (not to mention dry). Waldo Jaquith and Anil Dash <a href='https://twitter.com/anildash/statuses/227476701599391744'>made a great suggestion</a>: let’s be pragmatic here. Government doesn’t hold on to software because they are concerned about licensing. They hold on to software because they have better things to do, because it’s not within the culture, and because there’s no angry mob slamming a battering ram against the metaphorical front gates when they don’t.</p>\n\n<p>I don’t think the nuances of federal procurement law are even close to the first thing we should care about here. 
<sup id='fnref:6'><a href='#fn:6' rel='footnote'>6</a></sup> The concern is about whether feds should do the legwork to open source it or not. The question for us as developers, for the thought leaders in the space, isn’t how should the US government best license / not license software, but <em>how can the open source community help it to do so.</em> How can we get more software out the door? In a world of finite time, <em>how can we make open sourcing</em> <sup id='fnref:7'><a href='#fn:7' rel='footnote'>7</a></sup> <em>a bona fide priority</em>?</p>\n\n<p>How? For one, involvement in existing open source projects <sup id='fnref:8'><a href='#fn:8' rel='footnote'>8</a></sup> would surely send a strong message that there’s latent demand here, and would give the foot soldiers political air cover to forge onward with their efforts. For another, taking ownership of the code itself, and realizing it is <em>our</em> code, not the government’s, would surely change the tone of the debate by encouraging agencies to ship code sooner, rather than delaying release out of fear of criticism.</p>\n\n<p>Put simply, it’s about what role we are going to play, not what rights we are going to receive. 
Let’s at least get the source code; then we can go back to our regularly scheduled holy wars over licensing.</p>\n\n<p><em>As always, <a href='http://ben.balter.com/fine-print/'>views are my own</a>.</em></p>\n<div class='footnotes'><hr /><ol><li id='fn:1'>\n<p>I’d argue that all software, even government-funded software, should still be licensed under a traditional open source license, to resolve any legal ambiguity when used abroad under the terms of various international copyright treaties and agreements.</p>\n<a href='#fnref:1' rev='footnote'>↩</a></li><li id='fn:2'>\n<p>Although citizen contributions to that project would theoretically not be public domain, thus necessitating a license, which should be clarified in the project’s documentation at the time of release to avoid potential issues with 31 U.S.C. § 1342.</p>\n<a href='#fnref:2' rev='footnote'>↩</a></li><li id='fn:3'>\n<p>Although again, technically speaking the project as a whole would be licensed under GPL, individual code not dependent on the parent project could be used as a US Government Work.</p>\n<a href='#fnref:3' rev='footnote'>↩</a></li><li id='fn:4'>\n<p>Unless you’re looking at the <a href='https://github.com/languages/ColdFusion'>vibrant open source ColdFusion community</a>.</p>\n<a href='#fnref:4' rev='footnote'>↩</a></li><li id='fn:5'>\n<p>FAR 52.227-14(c)(1)(i). Even if the contracting officer grants such rights, they do not take effect unless the contractor includes a copyright notice at the time of delivery, acknowledging the government’s sponsorship and indicating the contract number under which it was procured. 
See FAR 27.404(a)(5).</p>\n<a href='#fnref:5' rev='footnote'>↩</a></li><li id='fn:6'>\n<p>General counsels across government already have enough ammunition to stymie progress.</p>\n<a href='#fnref:6' rev='footnote'>↩</a></li><li id='fn:7'>\n<p>Often the last and least seen step in the enterprise development process.</p>\n<a href='#fnref:7' rev='footnote'>↩</a></li><li id='fn:8'>\n<p>There’s been <a href='http://ben.balter.com/2012/04/15/cfpb-accepts-first-citizen-submitted-pull-request-on-behalf-of-federal-government/'>exactly one pull request to date</a> across all government GitHub repos.</p>\n<a href='#fnref:8' rev='footnote'>↩</a></li></ol></div> ]]></description>\n </item><item>\n <title>The Demise of the Personal Dashboard</title>\n <link>http://ben.balter.com/2012/07/10/the-demise-of-the-dashboard/</link>\n <pubDate>Tue, 10 Jul 2012 00:00:00 +0000</pubDate>\n <dc:creator>Benjamin J. Balter</dc:creator>\n <category>.govs</category>\n <category>analytics</category>\n <category>dashboard</category>\n <category>enterprise</category>\n <category>gov 2.0</category>\n <category>government</category>\n <category>KPIs</category>\n <category>start ups</category>\n <category>ui</category>\n <category>ui/ux</category>\n <category>ux</category>\n <category>wordpress</category>\n <category>Technology</category>\n <guid isPermaLink=\"false\">/2012/07/10/the-demise-of-the-dashboard</guid>\n <description><![CDATA[ <p><a href='http://ben.balter.com/wp-content/uploads/2012/07/dashboard-all-the-things.jpeg'><img alt='Dashboard all the things' class='alignright' src='http://ben.balter.com/wp-content/uploads/2012/07/dashboard-all-the-things-300x225.jpeg' /></a></p>\n\n<p>I was recently asked how I would architect a personalized dashboard experience for visitors to a large, customer-facing website. My response? <em>I wouldn’t.</em></p>\n\n<p>A dashboard in a car or airplane makes sense. 
It’s not as if I could click “speedometer” while driving or press the “altimeter” button while flying. I simply need everything at all times. But virtual interfaces don’t have that same limitation. In fact, they don’t have any limitations. A dashboard can have as much information as the most ambitious engineer can dream up — and that’s exactly the problem.</p>\n\n<p>Put it in context: Google <a href='http://googleblog.blogspot.com/2012/07/spring-cleaning-in-summer.html'>recently announced the retirement of iGoogle</a>, its own personalized dashboard, and I second their nomination to induct dashboards into the #doingitwrong hall of fame, joining the likes of internet portals, splash pages, and well, basically anything involving ActiveX or Flash.</p>\n\n<p>Dashboards were a fun user interface experiment. They really were, especially compared to the static pages they evolved from. That was the whole point of Web 2.0, wasn’t it? Personalization? I mean, it was really cool to drag and drop widgets, and build a virtual command center to monitor my little corner of the internet, and that was fine when there wasn’t much internet out there to monitor. But the web collectively hit a tipping point a few years back. From push notifications to always-on e-mail, in more ways than we imagine, we now bombard ourselves with more information than we can physically process at any given moment. <a href='http://www.apple.com/iphone/features/retina-display.html'>Quite literally</a>.</p>\n\n<p>Think about it this way: when customers come to a website, they’re not looking to solve 10 problems. They’re looking to solve one. They don’t want all the potentially relevant information thrown at them all at once; they just want what they need. And they want computers to make that determination for them. 
But hey, this isn’t the first time those who predict our user experience needs have erred on the side of <a href='http://www.pocket-lint.com/images/dynamic/NEWS-32125-b3a8b509bc5e3a074f7f240f57d71aa9.jpg'>moar is better</a>.</p>\n\n<p>So that’s it? That’s the end of simultaneous streams? <a href='http://www.informationweek.com/news/software/productivity_apps/240003296'>Far from it</a>. This once-disruptive technology now has a long journey down the Technology S Curve as it becomes the go-to solution for all the business intelligence and project analyst types that stumble across it, in other words, the late adopters.</p>\n\n<p>Don’t get me wrong. I’m sure guilty of building <a href='http://my.fcc.gov/'>a dashboard</a> or <a href='http://codex.wordpress.org/Dashboard_Screen'>two</a> in my day. I’m not saying that they’ve never had a place. What I’m saying is that today, not even the most complex dashboard could give you an accurate snapshot of its genus’s future. If not dashboards, then what? Beyond turning everything into a ubiquitous search box (<em>a la</em> <a href='http://gov.uk'>gov.uk</a>), I’m far from a UI/UX expert, but I tend to think that startups generally have a pretty good sense of what’s next. They have to. If they don’t get it right the first time around, they tend not to have a second try. So what do we see?</p>\n\n<ul>\n<li>\n<p><strong>Activity -</strong> Social apps like Facebook, Twitter, Foursquare, even GitHub are all built around the concept of activity. Whether it’s a news feed, recent checkins, or even commit activity, the question I come with is “what’s going on?” and it gets answered as in depth as I care to scroll through, not as in depth as an engineer arbitrarily decided I needed a few years back. It’s linear. It’s <a href='http://en.wikipedia.org/wiki/Inverted_pyramid'>inverted pyramid</a>. 
It’s customized by whom or what I follow, not by what I add or (re)arrange.</p>\n</li>\n\n<li>\n<p><strong>Minimal</strong> – Productivity apps like Gmail, Google Reader, even Dropbox don’t summarize for me how many e-mails, unread posts, or free MB I have as soon as I log in, and with the exception of a few labs features here or there, don’t even give me the option to have anything more than a bare-bones inbox, unread feed, or directory listing. In fact, Gmail and Google Reader were recently criticized for <a href='http://jonoscript.wordpress.com/2012/04/26/gmail-designer-arrogance-and-the-cult-of-minimalism/'>going a bit too far</a> in this direction. But the lesson is the same: just give me my stuff and get out of the way.</p>\n</li>\n\n<li>\n<p><strong>Immediate</strong> - Transactional apps like Uber or Square focus on action, not the past (or even the present). When I open the Uber or Square apps, I’m immediately presented with the ability to request a vehicle or swipe a card, not my top tasks, not an arbitrary array of options or metrics, not with recent news about the product or popular add-ons. The app simply stands at attention, awaiting orders. I actually had to dig a bit to find my transaction history and related business analytics, and I’d argue that’s a really good thing.</p>\n</li>\n</ul>\n\n<p>Think about the last time you used a drag-and-drop dashboard: If you’re like me, it’s going to be either Google Analytics or WordPress, and if that’s the case, it’s simply known as <em>the screen you see after you log in, but before you can do what you need to do</em>. It’s wasted pixels. It’s cruft from a bygone era when clicks were expensive and developers were left wondering “how can we fit more on a page”.</p>\n\n<p>Options are a crutch. It’s the natural tendency of any engineer to over-engineer a system, and that tendency is even stronger in a risk-averse, top-down culture <a href='http://www.google.com/?q=dashboard+site:.gov'>like government</a>. 
But your job — as an engineer, as a product manager, as a user — is to push back, to fight that urge, to make <a href='http://wordpress.org/about/philosophy/'>decisions, not options</a>. Not convinced? That feature you can’t <a href='https://github.com/blog/1091-spring-cleaning'>bring yourself to cut</a>? Expose it through your API and see how many users complain.</p>\n\n<p>It’s no longer a question of “is this possible technologically?”. It’s no longer a matter of “can you expose me to that information 24/7?”. Ever since the advent of <a href='http://html5zombo.com/'>Zombo.com</a>, the only limit is our imagination. We’ve figured out the hard stuff. It’s not centralization and personalization. It’s decentralization and interoperability. Simplicity is the new black.</p> ]]></description>\n </item><item>\n <title>Why You Should Always Write Software as Open Source, Even When It's Never Going to Be</title>\n <link>http://ben.balter.com/2012/06/26/why-you-should-always-write-software-as-open-source/</link>\n <pubDate>Tue, 26 Jun 2012 00:00:00 +0000</pubDate>\n <dc:creator>Benjamin J. Balter</dc:creator>\n <category>.govs</category>\n <category>agile</category>\n <category>code</category>\n <category>contracting</category>\n <category>development</category>\n <category>enterprise</category>\n <category>gov 2.0</category>\n <category>government</category>\n <category>IT</category>\n <category>open source</category>\n <category>procurement</category>\n <category>proprietary</category>\n <category>Business</category>\n <category>Technology</category>\n <guid isPermaLink=\"false\">/2012/06/26/why-you-should-always-write-software-as-open-source</guid>\n <description><![CDATA[ <p><a href='http://ben.balter.com/wp-content/uploads/2012/06/mike-holmes.jpeg'><img alt='Unsatisfied with your Contractor?' class='alignright' src='http://ben.balter.com/wp-content/uploads/2012/06/mike-holmes-203x300.jpeg' /></a></p>\n\n<p>There are two kinds of software: kludgy software and open source. 
Think about it logically. When you (or your organization) is the only person that’s ever going to see something, you’re a lot more likely to “just make it work.” After all, who would ever know? <sup id='fnref:1'><a href='#fn:1' rel='footnote'>1</a></sup></p>\n\n<p>But the same logic that applies to sweeping literal dirt under the rug doesn’t apply to writing code. Whereas a rug will always serve to cover the floor, applications evolve over time and code is constantly reused and repurposed as customers’ needs change. Simply put, it’s impossible to predict today where your code is going to be a year from now, and it’s in your best interest to plan accordingly.</p>\n\n<p>Open source hedges this risk by distinguishing generic logic (say posting content online) from application-specific customization (say the use-case-specific presentation of that content). Yet when you’re writing with the intention of producing proprietary or one-off code, you do everything in one pass. The true challenge arises when the same problem emerges again in another department, another business unit, or more generally in an even slightly different context. You’re reinventing the wheel. You’re “open sourcing” (even if within your organization). The solution? Always assume your software is going to be open source, even if you know it’s never going to be, and here’s why:</p>\n\n<p><strong>Flexible from the start</strong> - Imagine you’re building a house and the contractor literally nails down all your furniture at the outset, saying you could always remove it before you sell. You’d almost certainly hire a new contractor. Even if you’re never going to sell the house, you may want to get a new couch, or at the very least change a room’s layout somewhere down the line. Yet software developers do it all the time. We custom-build solutions, and then go back and abstract logic to “open source” it as needed. You’re doubling the effort. 
Keep logic separate from implementation-specific customization, and you’ll have a shared, portable solution from day one. Put another way, your business unit is in no way special or unique. The same logic that presents updates about the latest line of widgets to your customers can also be used to update the same customer base about cogs, and you should prepare for that potential synergy from day one, even if not immediately realized.</p>\n\n<p><strong>Modular by design</strong> - Distinguishing unrelated components encourages several coding best practices. In addition to introducing a modular design, meaning additional components could easily be added (or existing components removed) down the line, abstraction often yields objectively more stable and more maintainable code by avoiding the copy-and-paste effect. Put another way, you’re forced to build elegant solutions — the fact that others are not only going to see, but have to be able to use and adapt your code forces you to follow best practices like namespacing, abstraction, and object-oriented programming.</p>\n\n<p><strong>A message to your future self</strong> – Ever go back and look at old code, <a href='https://twitter.com/BenBalter/status/209356982983999488'>only to scratch your head</a> as to what’s going on? The same you that may be asking yourself what you were thinking when you got a tattoo five years back is also going to be asking why you wrote that singleton function five years ago. Yet when you write open source, you mitigate that risk by explaining your code in such a way that others (including your future self) can understand it. In a world of service-oriented architectures and ever-changing requirements, the chance that a software project is one-and-done is increasingly rare, not to mention the fact that by failing to properly document, you’re introducing a significant risk of vendor lock-in. 
Your successor will thank you, and so will the person paying the bills.</p>\n\n<p>The reality of today’s business environment is that all software is inherently “open source”, even if the scope of the sharing is limited to an organization. Assume the software is open, needs to be modular, and will be repurposed, and you will save significant costs in the long run. And when you require the same of outside contractors, you get better, more flexible code, and offset the risks of vendor or technology lock-in in the long run.</p>\n\n<p>Justice Brandeis is famous for noting that “sunlight is the best disinfectant.” Likewise, the transparency afforded by the open-source ethos produces <a href='http://www.coverity.com/library/pdf/coverity-scan-2011-open-source-integrity-report.pdf'>more reliable software</a> – so why not simply assume your code is going to be open source from the start?</p>\n<div class='footnotes'><hr /><ol><li id='fn:1'>\n<p>The same would apply when you’re buying software and the contractor is under the impression no one outside the organization will ever see the code, and more importantly, the code could never negatively impact the public’s perception of their overall work-product.</p>\n<a href='#fnref:1' rev='footnote'>↩</a></li></ol></div> ]]></description>\n </item>\n</channel>\n</rss>"}