Barack Obama Directs All Federal Agencies to Have an API (apievangelist.com)
298 points by mcrider on June 1, 2012 | 77 comments



I'm reminded of this recent post by James Fee, talking about geodata, but I think it applies to the general case:

    "... The question was APIs or downloads... 
     Personally, I believe [data] is one of the best 
     ways for citizens to keep track of their government 
     (local to federal) ... APIs tend to deliver what 
     their “owners” want them to do. Raw data means 
     everyone has an opportunity to check each other’s 
     work. Of course, raw data can be manipulated as 
     well, but it is harder to obscure."
- http://spatiallyadjusted.com/2012/04/03/sharing-data-downloa...

I couldn't agree more. APIs are great, but are not the key to open government, for two reasons:

1. They don't provide non-technical individuals with simple, direct access to the raw information.

An API for querying a historical dataset shouldn't exist unless the dataset is already available in a static format. Release the data first, then build an API if there is demand (or wait for the private sector to build a better one for you).

2. Historical data access is poorly served by APIs.

There is no such thing as a good 'general use' API[1]. APIs are appropriate for specific, service-based transactions that involve some level of processing (see the sketch after this list). Examples:

    * VAT/GST number validation
    * Road closure notifications
    * Identity services
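
A minimal sketch of the kind of transaction I mean - the endpoint and response shape here are entirely hypothetical:

    # Transactional, per-request processing: the one place an API
    # genuinely earns its keep. URL and fields are hypothetical.
    import json
    from urllib.request import urlopen

    def validate_vat(number):
        # one small, well-defined question answered per request
        with urlopen("https://api.example.gov/vat/" + number) as resp:
            return json.load(resp)["valid"]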
3. Bonus reason: government agencies suck at building APIs.

They're not good at determining what is genuinely high value to end users; they tend to prefer visible projects that can justify budget increases over genuinely useful but less easily communicated ones (cf. the US national highway system and pork-barrel politics); and there is an entire industry of enterprise vendors heavily invested in keeping it this way.

TL;DR Release the data, let users build the APIs. Everyone wins.

Bootnotes:

[1] I lie. That's exactly what publishing raw data at stable URLs on a website achieves.


The exciting thing about APIs isn't what they directly do for non-technical individuals.

Instead, a good API turns government into a platform on which free (and paid) services can be built to deliver that data in innovative ways. Examples are starting to appear in places like Chicago, which has opened up access to a lot of data, including transit data (bus tracking, etc.) and much more. Giving hackers platforms to innovate will definitely yield better results than just throwing gobs of data at the general public. (Never mind that not all raw data is created equal, or that raw data requires savvy people to distill it.)

It also means that it's potentially going to be easier for one unit of government to interact with (or at least query) another. That may be big as well.


A couple of points.

Whenever you put the words 'government' and 'innovation' into the same sentence, you need to check your working. Government is, by and large, terrible at technical innovation. And innovation by mandate will, predictably, be a non-starter.

That said, there are areas where it is appropriate - see my earlier examples (transit bus feeds are another) - where the relevant government agency has found a niche in which (a) only they can provide the data, and (b) the problem they're solving is very easily defined.

Finally, I really take issue with this statement:

> "giving hackers platforms to innovate will definitely yield better results than just throwing gobs of data at the general public."

Utter rubbish. The "general public" is not stupid. It includes huge numbers of people with the domain knowledge and wherewithal to analyse the data. To claim that open and transparent government is better served through an elite technical class than by delivering data directly to individuals is simply false.


Perhaps I'm misinterpreting your use of the term technical innovation, but weren't the Manhattan Project, NASA and many military projects innovation by mandate that were extremely successful?


It's quite one thing to form agencies with specific science and engineering goals (the Manhattan Project, NASA) and populate them with experts and billions of dollars, and quite another to expect the same results from other departments without that background.

And even then: the progress that SpaceX has made in the last few years is almost textbook proof of my point. NASA provided SpaceX with a wealth of knowledge and experience that NASA itself had utilized very poorly (dollar for dollar) over the last 20 years. Within a very short time-frame we've suddenly seen innovation in the space sector like we haven't seen since the Apollo programme.

As for the military sector - given that maintaining a monopoly over the application of military force is (fundamentally) the primary function of government, it's not surprising that this is the one area of government that has understood its place for a very long time: issuing contracts with stated operational aims and leaving the private sector to provide the innovation. Yes, the system is flawed - the F-35 programme is a bit of a disaster - but even that can be at least partially blamed on government agencies interfering with the procurement process.


Yes, but it doesn't take lots of experts and billions of dollars to build an API, I would think.

And yes, we've seen tremendous innovation from SpaceX - but where would they be without that knowledge and experience from NASA? Just because NASA has been turned into a bureaucratic mess with funding that bounces around doesn't mean they haven't created a lot of value and new knowledge, most of which would never have been funded by the private sector. Government work isn't about having good returns on money spent, which strikes me as both a blessing and a curse.

I wonder how many current day innovations have sprung from the initial work of places like the DOE National Labs, and how many might in the future, for all that it's a government program.


> Yes, but it doesn't take lots of experts and billions of dollars to build an API, I would think.

Refer to point 3 in my original top post. Turning a simple API into an expensive, drawn-out project is something they would prove remarkably adept at achieving.

> ...but where would they be without that knowledge and experience from NASA?

Exactly! A thousand times so! Releasing the knowledge and experience that only a government-funded entity could have amassed (cf. raw data) has slingshotted a private enterprise into rapid innovation. This is precisely why I think releasing the raw data is the most beneficial outcome.


Not really. No appointed official woke up one morning and thought "let's make an atom bomb". A bunch of scientists came to the govt and said, "we can make an atom bomb, we just need the funding".



Hmm, not sure API vs data is productive to see as a dichotomy.

Consider that an API might provide access to "raw" data directly.

I'd ask not for data over APIs but for more universally useful properties such as stability, currency, consistency, etc.


> Hmm, not sure API vs data is productive to see as a dichotomy.

It's not, necessarily. An API provides a [very] limited view into the data. A view selected by the publisher. That's part of the problem.

> I'd ask not for data over APIs but for more universally useful properties such as stability, currency, consistency, etc.

Amen. I'd add: clear license and usage conditions, simple and concise metadata, and frequent updates.
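
As a sketch of how that could look in practice - every field here is hypothetical, not any agency's actual schema:

    # Per-dataset metadata with license, currency and stability up
    # front. All fields are illustrative only.
    dataset = {
        "title": "Road closure notifications",
        "download_url": "https://example.gov/data/road-closures.csv",
        "format": "text/csv",
        "license": "public domain",
        "last_updated": "2012-06-01",
        "update_frequency": "daily",
        "contact": "opendata@example.gov",
    }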


I think kjhughes was going for the possibility that one of the things exposed in the API would be downloadEverything().

I'm not sure whether that's likely; he was just pointing out that APIs and data are not mutually exclusive.
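
Something like this, say - endpoints made up; the point is just that the narrow query view and the full dump can live side by side:

    # Hypothetical API where the publisher-selected query view and
    # the full raw dump are both first-class citizens.
    from urllib.request import urlopen

    BASE = "https://api.example.gov/road-closures"

    def query(region):
        # the narrow, publisher-selected view
        return urlopen(BASE + "/search?region=" + region).read()

    def download_everything():
        # the whole dataset, raw and unfiltered
        return urlopen(BASE + "/dump.csv").read()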


I think this is more about changing the mindset of the typical workers at the federal agencies (hundreds of thousands of whom are not doing IT related tasks as their primary function). Those of us who work for these agencies as programmers & IT people have a big job to do.

Only recently have the govies started thinking about how their data can be useful to the general public. In the past, everything had been stovepiped and guarded with people's live(lihood)s. Hopefully this makes them think about data integrity throughout the life of that data.

Think about how data used to be provided to the public. A bunch of government folks had to collect the data, make sense of it themselves, and put it together in a report destined for Congress. It's waaaaay different to just hand that raw data to the public. Rather, I think what we'll see is more sanitized data sets, released after they've been internally analyzed and vetted (probably multiple times). Not exactly transparent.

But I hope one day, after many iterations of API building, we'll get to a point where the data truly is transparent.


> Only recently have the govies started thinking about how their data can be useful to the general public

The problem is that they can't anticipate how their data is most useful. Release and let the users decide.

> But I hope one day, after many iterations of API building, we'll get to a point where the data truly is transparent.

So, we should wait many years while, predictably, large budgets are blown on faltering "government innovation" projects, when we could just have the data and use it today?

This just doesn't make sense!


LOL I think we both want the same thing. I'm just being realistic. I highly doubt you'll get all the data you really want this year or next. Federal agencies have a hard time releasing data quickly: too many checks and balances, too many people blocking, too many lawyers and unions, too many politicians.


They've published a lot of data, too.


They seem to be noting their third anniversary on that, in fact: http://www.data.gov/


As is obvious to most on HN, requiring an API (as opposed to a CSV file release schedule, etc.) is fairly meaningless, and most definitely not a presidential-caliber dictate. Some agencies' data might be far better suited to publication in a CSV and posted on a web page, for example.

If a president could have a meaningful impact on this sort of thing, it would be in setting a high bar for the quality of information released by agencies. Any sort of requirement of this kind is completely absent from the announcement.

So rather than being about transparency as it's being touted, the announcement is a celebration of high tech obfuscation. Soon the same sort of insulting, opaque, useless information spouted by officials in press conferences will be available via HTTP. This is at best a neutral day for democracy.


Seems like the executive mandate is an "80% now" solution instead of waiting for perfection. At least they're thinking about the dissemination of information.


Seems like, if the President were interested in "now", he would have been working on this for going on 3.5 years, and would actually be publishing some APIs by now.


I'm going to my apartment now. 3.5 years ago I would not have been going there. Now and 3.5 years ago are not the same thing. He's interested in doing things now, and since neither he nor I can turn back the clock, that's as good as either of us can do.


Just like he's just getting around to planning his re-election campaign "now".

If the President of the United States wants to be impressive -- and I don't know that APIs are where one would start -- he can do better than make announcements.


They were putting out ads against Romney at the beginning of the Republican nominee campaign.


FYI data.gov launched in May 2009, after being announced by Obama's CIO in March 2009. This isn't a new effort; it's the continuation of ongoing efforts.

For example: http://ideascale.com/userimages/sub-1/736312/ConOpsFinal.pdf is a December 2009 document laying out the vision for Data.gov, and http://www.whitehouse.gov/sites/default/files/omb/assets/mem... is a memorandum from 2009 that first started requiring the online sharing of major datasets.


Technically, having a bunch of HTTP resources that return data of type text/csv in response to GET requests is a perfectly valid API. It's also easy to build: just put those CSV files in a directory and have Apache serve it up.

How useful this is depends on how clear the data is, how well they document things, how sane their document formats are -- in other words, it depends on things that are much harder to mandate than just "have an API". I'll predict in advance that most of the APIs here will be pretty half-assed.
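
To make that concrete, consuming such an "API" is a few lines (URL hypothetical):

    # A directory of CSVs behind a web server is already an API.
    import csv
    import io
    from urllib.request import urlopen

    with urlopen("https://example.gov/data/budget-2012.csv") as resp:
        reader = csv.reader(io.TextIOWrapper(resp, encoding="utf-8"))
        rows = list(reader)

On the publishing side, anything that serves static files will do.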


Technically having a bunch of HTTP resources that return data of type text/html in response to GET requests is a perfectly valid API. Indeed, it's one your web browser uses all the time. There are also plenty of products that let you set up such a system quickly.....

So I guess I agree with you but would even take it further.

Of course, if that API is Stuxnet or Flame-based....


I disagree. Certainly some agencies may have data that ought to be examined holistically via CSV (at least for researchers?), but having data available via API is better than the status quo.

While this might not set a "high bar for the quality of information," the President's effort shows a commitment both to technologically streamlining government agencies and to transparency, and it is, at the very least, a step in the right direction.


I'm not disagreeing with your point, but I wonder if you actually read the memo from the CIO, or are just speaking ex cathedra about the requirements being given?


>As is obvious to most on HN, requiring an API (as opposed to a CSV file release schedule, etc.) is fairly meaningless, and most definitely not a presidential-caliber dictate. Some agencies' data might be far better suited to publication in a CSV and posted on a web page, for example.

And for those cases, the agency can still justify why the CSV is better. There are always ways to get around the rules; see the Section 508 rules for handicapped users.

>If a president could have a meaningful impact on this sort of thing, it would be in setting a high bar for the quality of information released by agencies. Any sort of requirement of this kind is completely absent from the announcement.

This is a step in the right direction: once the data is more accessible, the "users" (developers) can push for better data. The /DigitalStrategy page requirement is really good in my opinion, and will make things simpler than the current mishmash of sites buried in menus and behind authentication walls.

>So rather than being about transparency as it's being touted, the announcement is a celebration of high tech obfuscation. Soon the same sort of insulting, opaque, useless information spouted by officials in press conferences will be available via HTTP. This is at best a neutral day for democracy.

Whoever said this was about transparency in government? It's not about transparency per se. This is more about making information easier to access and find. In many cases agencies already have APIs, web services, and data dumps, but they're really buried. How is making them more visible neutral?


I think this should be judged against the status quo as a positive development rather than against an abstract ideal as a flawed concept. Having seen too many clients stuck in analysis paralysis or blocked by political/turf issues while trying to develop corporate-wide standards (protocols, object models, etc), I'm just happy to see online access to public/government data advance in any way.

If we had to wait for higher-level, coordinating standards first, progress might never come.


Meanwhile, just yesterday the House Committee on Appropriations voted to [indefinitely delay][1] making legislative data available in machine-readable (XML) format. It's a repeat of a move taken in 2008 to "make a plan to make a plan" that never really goes anywhere. In other words, it's not gonna happen for a long time yet.

[1]: http://sunlightfoundation.com/blog/2012/06/01/bulk-access-de...


Don't expect someone to do something that goes against their self-interest. Especially if they can do a song and dance and make people forget that nothing is getting done.


"...Within 90 days of the date of this memorandum, create a page on its website, located at www.[agency].gov/digitalstrategy, to publicly report progress in meeting the requirements of the Strategy in a machine-readable format.... ...implement the requirements of the Strategy within 12 months of the date of this memorandum and comply with the timeframes for specific actions specified therein"

3 months to get a "machine-readable" status report on implementing an API?

Then, complete the implementation in 12 months?

If it takes 3 months for an agency to get a status report up, how long will it take them to implement said API? Government work, sheesh....
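
To be fair, the status report itself is the easy part. Here's a sketch of what a machine-readable /digitalstrategy payload could be - the memo specifies no schema, so every field here is hypothetical:

    # Hypothetical progress payload; shows how little
    # "machine-readable" has to mean.
    progress = {
        "agency": "example",
        "milestones": [
            {"requirement": "expose two high-value datasets via API",
             "status": "in_progress",
             "target_date": "2013-05-23"},
        ],
    }

The 12 months is for everything else.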


3 months to get a "machine-readable" status report on implementing an API?

2 weeks for the director of each agency to delegate someone to be responsible for this. 2 weeks for said responsible person to figure out what an API is. 6 weeks for them to go around to everybody in the agency asking "are you doing any APIs yet?". 3 weeks to take the feedback and turn it into a semi-coherent report.

If anything, I think 3 months is optimistic.


They need to be sure they 1) are exposing everything they need to be exposing, 2) are hiding everything they need to be hiding, 3) are ready to handle potentially significant load, and 4) are robust against attack. Each of these is a little harder and/or a little more important in government than at a start-up (your start-up is far less likely than, say, the DOJ to get an informant killed by posting the wrong thing), and 1 and 2 especially involve more than technical work - processes need to be set up to actually get the data where it needs to go in the first place. I'd love it if it could be faster, but I don't think 15 months is at all absurd.


Actually, that's just project management 101. The fact that it's government work doesn't make the steps to delivery any different than anywhere else, just the subject matter. Or, it shouldn't be any different -- though I know some claim otherwise.


You need to do 1, 2, 3, and 4 regardless of where you are, yes, but my point was that each of those is larger (in scope or import) in the context of a large, long running government agency than in the context of a startup or small company.

Remember that this has to accommodate the agency in the worst situation. The alternative would be to give separate deadlines for each agency in the memo, but that may be hard to determine from the outside (after all, "what are you going to do, how, and when?" is the first thing he's asking for) and you still have to wait for the last one to have your "Hey, everyone, there's data! Have at!" press conference.


I get where you're going, but it doesn't change the equation.

This is basically a conversation about project scope and management. Assuming this is more complex simply because it's a government agency is flawed thinking, though. I can certainly perceive complexity due to expectations or maybe the participants themselves (I've worked with some fed agencies), but that should have zero bearing on addressing the project itself.

But in reference to some of the specifics of how this is different (sensitive data, system load, system attacks, worst-case scenarios) -- this is begging for an iterative project approach as opposed to a waterfall one. While these issues aren't unique to government agencies, the context of addressing them might be (I have some idea of the technical capacities of our government). I can accept there may be a learning curve in this project, but believing it yields unique problems that haven't been dealt with elsewhere -- respectfully, that's just wrong.

If this is all very new to these federal agencies, that simply suggests a smaller initial scope that can grow after multiple iterations. Teams can learn about requirements, refine their APIs, shore up their operation, etc. Not to over-simplify, but this is basic project management.

Again, the government's purview may differ from little-startup-dot-com's, but the process of getting from A to B isn't really all that different.


They'll simply report that they haven't been able to implement the requirements, someone will get a slightly lower evaluation score, and life.gov will go on as usual.


Cutting isn't hard. Deciding exactly where to cut is.


My best guess is that in the initial 3 months they won't be doing just the page: they'll be outlining how to comply with the executive order over the following 12 months, buying any hardware they need, and delegating work to whichever branches require it.


I'd like to see you hire an entire department's worth of people, wait for them to devise an API for your agency, code it, provision server space and servers, and deploy in 12 months!


I get that this is a rhetorical question, stated in such a manner as to imply I have insufficient context for the real requirements of this project and cannot gauge its size and complexity. But I've got a full cup of coffee, so I'll take a shot at addressing this impossible situation.

  - hire an entire department's worth of people
The worst approach I can imagine is to start this project by hiring new, dedicated people. This project only succeeds by being incorporated into the very fabric of an agency's operation. Remember, new work must go on even after this API is in place. Hiring a dedicated group that somehow has to reverse-engineer everything the agency does, going forward, for purposes of external API access is a recipe for failure. Separate teams will already be at odds with each other; better to leverage the existing teams, as they're the ones best positioned to understand how external access to their operations should function.

  - wait for them to devise an API for your agency
I'm not sure if this refers to a lack of understanding of the requirements or to a lack of competency on the part of the team itself, but the presumed outcome is late delivery. If it's complex requirements, that simply suggests a basic, iterative project where scope is managed tightly and iterations are kept short (allowing the team to learn as the project moves along). If the suggestion is about the team's competency, hiring or contracting a few competent individuals to align with team leaders has worked well on past projects. In either case, proper management of the project approach can address timeline issues.

  - code it
This goes to team capability, but also to an understanding of integration. Again, a solvable problem based on team capacity. If this suggests a unique code stack not already available elsewhere, I'd need to understand the justification. Project management 101.

  - provision server space and servers
This implies that the technical operations of our agency are fully loaded, or can't address this in a reasonable timeframe, or that purchasing requires an inordinate amount of lead time, or some other unknown. If the suggestion is that dependencies exist that threaten the timeline, those dependencies should be mitigated. Again, project management 101.

  - deploy
This is the physical step of pushing bits live, and automation tends to make it quick. In my own projects, we thought about deployment when we designed the API, so by this point deployment is an operational matter -- not a how-do-we-do-this question. This project doesn't proceed without understanding this step.

Disclaimer: I've actually done this type of project work for significant operations of size and complexity. I've also worked with a few federal agencies, so I have some familiarity with the lay of the land.

It's really not that daunting a project. I would respectfully suggest you reconsider your own presumptions, as there are a lot of ways to approach this type of work.


are you kidding? an "entire department worth of people"? uh, how about 1 consultant for a few months and it can be hosted on an existing server. this is an api to provide access to data that already exists, not a rocketship to the moon.


This seems quite relevant now: http://xkcd.com/927/

What will this bring? Well, the US govt has X agencies. The result of this decree will be that, within 12 months, the public will get to enjoy the thrills of X incompatible web APIs, one unique to each agency.
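
Picture the same fact coming back from two agencies - payloads entirely hypothetical:

    # One fact, two hypothetical shapes; now multiply by X agencies.
    agency_a = {"station": "Union Sq", "delay_minutes": 5}
    agency_b = {"stopName": "Union Sq",
                "delay": {"unit": "seconds", "value": 300}}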


Better than it is now, where none of that data is available except possibly through FOIA lawsuits.


That hasn't been true in my experience.

What data are you looking for that isn't available?

The only thing I can think of that's difficult to get is some law enforcement, military and "national security" related information, but APIs aren't going to change that.

I don't have a ton of experience with it (mostly USGS and NOAA), but the Department of Commerce and the Department of Interior both make a ton of data available.


This reminds me of the push here in NYC for all of the city agencies to open their data via an API. It's gotten better over time, but when the initiative first took flight, it was terrible. Some of the APIs flat out did not work, and the ones that did often returned all sorts of malformed, non-normalized data. It was a nightmare to work with. I'm curious if the government can do better.


This is kind of interesting because maybe it shows how far the thinking is from technology right now. I can't wait to integrate FBI files into my web app, and maybe I can bypass 'authorized e-file providers' to file my taxes. Maybe I can download daily spy satellite imagery. My point is that what is already meant to be available is probably already available.

Decision makers are often excited about technology but don't really get the ground level experience. They want to do all the things...on a roadmap...with milestones. Mobile has to be involved in some way.


This will likely go down the same way the original IPv6 mandate did: postponed once already, and likely to be postponed again when nobody has met it.

The issue is far more complicated than the comments here give it credit for. Don't get me wrong: there will be delay while the PHBs get their heads around what an API even is, but they'll have routed the directive to their CIOs before then, and the CIOs will understand the requirement - and how impossible it is.

The biggest issue is that the data isn't really owned by the government entities. I mean, the data is theirs, but it's locked up in vendor-provided tools and/or custom, vendor-built products. If they're using Oracle AquaLogic (or whatever it is now) to host the majority of their portal content, they're dependent on Oracle either to come in and show them how to implement the feature (at a significant cost in service dollars) or to build API exposure into the product, if it doesn't exist yet.

If they've got custom-built portals, they'll need to consult the vendors who wrote them, or who maintain them now, and get them to add that in. That means modifying the contract originally bid for the project, which alone will eat up a couple of months of the timeline. Then they'll have to figure out what actually makes it into the API, how to reliably segment sensitive data, how to get it through ISSO testing, etc. It's almost impossible for a project of any significance.

On top of that, they'll have to do it with a budget they don't have and with resources allocated elsewhere. The only way the government really gets anything done is by committing large amounts of resources in an uninterrupted fashion. They don't have the capacity to be agile, and to some extent that's by design.


It's going to be a pretty huge undertaking.

I assume that this will lead to some discussion on API standards, as multiple agencies simultaneously realize the implications.


15 years ago I contracted with several large federal agencies. Back then, we were pushing for the same thing. It never flew.

I imagine after 15 years they may have a chance at this, but I would caution those of you who have never worked in huge government IT shops to take this with a grain of salt. The situation is so bad in many places that Congress has been passing laws making it illegal for the federal systems not to behave in a certain way. And still things are broken. We passed the point of desperation many years ago.

Big IT in general is broken, and government IT is the most dysfunctional of any IT on the planet. I remain hopeful that this executive order can accomplish something, but I'm not holding my breath on it. Hopeful is one thing. Excited like this guy is? Not at all. Maybe in another 15 years. Maybe.


I'd be interested in the NSA's API...


Just a new HTTP code for the other services to implement:

    312 Pay No Attention To The Proxy


Exposes a single REST endpoint. POST /UploadAllUserData.

Meh, they're evil, it's probably SOAP and you have to discover it.

edit: Sorry guys! Didn't realize a tongue-in-cheek comment about the NSA having an API was so super-serious! It's like Oprah in here giving out downvotes: "Free downvotes for you and you and you."


In enterprises where departmental data has been opened up through APIs, like Wells Fargo and Amazon, there have been tremendous benefits.

In the case of Amazon, this was achieved by CEO fiat, and strongly tied to employee evaluation. (To the point where employees in groups that failed to do so would have been evaluated right out of the company.) I wonder if POTUS has this kind of power over the federal bureaucracy.

Also, I would wonder if this is to be done securely.


As a programmer/contractor for DHS, I'd love to hear what you all think would be a useful set of APIs for DHS to make public. It's all fine and dandy to say 'oh yeah we have an API' but it needs to be something useful. So what would you want to see? Financial/Budget type data? Performance metrics across the different Components within the Department? What?


How about TSA budget, # of TSA employees, # of terror plots stopped, some ratios between those things, # of nail clippers confiscated per employee, etc.

;)



Weird, I searched for the URL and also thought HN would automatically detect a dupe. Didn't mean to hijack a post!


The URLs aren't identical, one has "index.php" at the end, the other doesn't:

yada yada ... barak-obama-directs-all-federal-agencies-to-have-an-api/

yada yada ... barak-obama-directs-all-federal-agencies-to-have-an-api/index.php

With regard to searches, it depends on what you search for. HN Search is not always obvious, and it doesn't index minute by minute. You may have done the search before the engine caught up.


In rudimentary terms, I suppose it's at least a step in the right direction. It doesn't necessarily mean the US government is about to embrace openly disseminating its data, but it's progress, and third-party services will most likely proliferate quite rapidly.


It'll be interesting to see whether they decide to use NIEM (National Information Exchange Model) as a way to transfer information to the public as well. https://www.niem.gov/Pages/default.aspx


The first thing that caught my eye was the misspelling of Barack in the address :)


It's hard to imagine some federal agencies being able to do much of anything within 90 days. But, I look forward to poking around with some of these APIs.


Wish they'd done some reasonable amount of procurement reform before this. All this means is another big payday for government contractors.


The two issues are completely orthogonal.


Actually, no, they're not. They're tied at the waist. Government's antiquated technology acquisition schedules mean that it cannot tie itself to Moore's law. Because government currently buys through an exclusive system rather than an inclusive one, the tech it does end up getting - often 18 months late - isn't very good and is very hard to upgrade.

Thus a mandate that "all agencies should do x" with tech, issued before acquisition reform, creates severe problems down the road and only really benefits the contractors who build the systems.

By the time these APIs really see the light of day, we will be complaining about them.


I disagree. If your to-do list consists of "learn to touch type properly" and "write a better compiler for C++", you'll find some ways of prioritizing your to-do list more conducive to accomplishing your goals than others.


    from dod import air_force

    air_force.launch({"f22": 3, "b2": 4})


Hope we can apply for H-1Bs through an API real soon, too.


Get ready for some hackathons! I'm sure there are a lot of useful and interesting things to build with whatever comes out of this.


Has he been reading Steve Yegge's redacted post?


APIs? C'mon.

Bulk data.

Agree with polemic.


It's called a Web Service, noobs!


An API is much better than nothing. I say begin with the API, then require every API to have a basic raw_data_dump(start_date, end_date). And yes, this isn't about transparency per se; it's more about efficiency, but it can still lead to transparency.
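
A sketch of what I have in mind - the data store here is a stand-in:

    # Every query API also exposes a bulk escape hatch.
    from datetime import date

    RECORDS = [  # stand-in for the agency's real data store
        {"date": date(2012, 5, 30), "row": "..."},
        {"date": date(2012, 6, 1), "row": "..."},
    ]

    def raw_data_dump(start_date, end_date):
        # everything in range, unfiltered - no publisher-selected view
        return [r for r in RECORDS
                if start_date <= r["date"] <= end_date]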


Bloomberg "learning to code", Obama directing agencies to have an API.

All part of politicians (particularly Democrats) trying to look like they have a clue. Give up already, for heaven's sake, and get back to managing the deficit.



