Last Day at The Blue Review

Four years ago, a small group at Boise State launched The Blue Review with a quote post from Walter Lippmann, attempting to explain our tagline — “Popular scholarship in the public interest.”

Lippmann wrote in his 1921 book, Public Opinion, that the primary defect of representative government is, “the failure of self-governing people to transcend their casual experience and their prejudice, by inventing, creating, and organizing a machinery of knowledge.”

Lippmann tasked journalists with at least organizing, or explaining, the knowledge that our society produced. My job, as a journalist and founding editor of TBR, was to serve as a bridge between town and gown, to publish articles grounded in academic research but written for the general public on topics of local interest.

We wanted to transcend the trivial and dull, as Lippmann might have put it, in favor of “lively commentary, informed scholarship and critical conversation on politics, cities, the media and the environment,” as we put it.

It is because they are compelled to act without a reliable picture of the world, that governments, schools, newspapers and churches make such small headway against the more obvious failings of democracy, against violent prejudice, apathy, preference for the curious trivial as against the dull important, and the hunger for sideshows and three legged calves. — Lippmann, Public Opinion

And we did. Over the past four years, we published almost 400 articles from some 175 scholars and public intellectuals, garnering more than 2 million page views and curating special issues on elections, schooling, cities and race. It was fun, stimulating and fascinating to meet so many energetic and well-read professors at Boise State and beyond.

The Blue Review, Winter 2014
Winter 2014 “Racial Discrimination in Idaho” cover with “Skeletons” art from Bobby Gaytan.

We achieved a tone that I think Lippmann would have appreciated, but we also injected some of the democratic ideals of John Dewey into those pages, inviting writers from all walks of life to contribute, hosting public forums and debates and opening up the comment sections. This formula — a highly curated, well-crafted, transdisciplinary, town-gown blog — remains unique in higher ed, and I want to thank Boise State University and all the people who supported it over the years for providing such a forum.

This is my last week at the helm of The Blue Review and I’m eager to transition from editor to reader. Jill Gill and Justin Vaughn, who co-direct the Center for Idaho History and Politics, will become the publication’s new stewards. Justin and Jill have both written extensively for TBR and are committed to scholarship in the public interest and the type of community engagement we’ve cultivated these past four years.

I am moving on to a new chapter of my writing career, which I will announce next week — still crafting and organizing the “machinery of knowledge,” but in a different context and forum.

Here are a few of my favorite TBR posts from the past few years:

There are many, many more, and I encourage you to explore the archives (hint: change year in URL), to add your voice to the comments and to keep an eye out for future TBR events.

Here’s to the next four years!

 

Hackfort 3: Why Hack for the Community?

For the past two years or so, I have been slowly learning to code, and I’m doing it for one main reason: There are systems out there that I want to help fix.

One of those systems is the Idaho state legislature’s website: it bothers me that bill and legislator information is buried so deep, so difficult to find and so lacking in useful detail. So I am working on an alternative legislative website that will make it easier to find and track Idaho (and perhaps other states’) legislative info.

I also recently got involved in a grassroots election results hack, Open Elections, that is attempting to standardize election result reporting across the country. I helped build Latino208.com, and I’m working on civic-minded web projects in both of my day jobs, at Boise State and in the city’s IT department.

Next week I’ll be working, on my own time, on an app for the Hackfort Hackathon, the civic tech track at the Treefort Music Fest in Boise. I volunteered for the civic track committee, and, full disclosure, I also helped pull together the City of Boise’s Hackfort data offerings that may aid in the hackathon.

[Lots of Hackfort events on FB]

This year, Jimmy Hallyburton of Boise Bicycle Project put out a call for a bike crash reporting app, which inspired the general topic of “hacking Boise transportation” for this year’s hackathon. The idea is to crowdsource a database of otherwise unreported bike crashes in order to help identify the causes and locations of bike-auto collisions and, ideally, to prevent them in the future.

Inspired by a call to action from the Boise Bicycle Project, #Hackfort3 asks developers, designers and civic hackers to consider: What data and/or software do you need in your pocket in order to get around town better?

I really hope one or more teams tackle this idea: it’s valuable data to collect, it’s community-driven, it provides potentially life-saving information to cyclists as well as drivers, transportation planners, police officers and researchers, and it supports the mission of a kickass local nonprofit. Check the hackathon page at Devpost to find a team or start a new one.

Here are nine ideas for apps, some based on available public data, for anyone to use (leave more ideas in the comments or on reddit):

  1. The Boise Bicycle Crash Reporting app, mentioned above, of course.
  2. Skateboard route finder, with skater-friendly paths, byways, stairs and bowls marked — perhaps a Google Maps hack for skater routing?
  3. Better Boise Bus Finder: ValleyRide now has a bus geolocator, but perhaps there is a better implementation of it out there?
  4. A Boise GreenBike dashboard that shows how people get around Boise by bike, as the Ada County Highway District considers more permanent bike lanes downtown and elsewhere (use the ACHD bike lane/route map and the Boise GreenBike 2015 ride data and realtime bike availability API!) –> See EXAMPLES
  5. Ridesharing, for real. Uber has already disrupted the taxi economy in Boise, but there may be a market for legit ride sharing (maybe not?) based on workplaces, recreation venues (Bogus?), etc. (Think Park & Ride meets Uber…)
  6. Tours: Lots of tour ideas, depending on your fancy, but last year the winning Hackfort team made a random date generator. This year, you could help people get to their dates: directions between park amenities, added Greenbelt data, Foothills trail data, mashups of other APIs, or non-food Yelp amenities.
  7. Analyze crashes that involve cyclists and ARE reported to find patterns, seed the bike crash app mentioned in No. 1, save lives.
  8. Model future scenarios for transit in Boise, i.e. trains, planes and automobiles in 2030, 2040… perhaps a game?
  9. Put your favorite stuff on the map, the OpenStreetMap.

The hackathon has been open for some time, but it formally kicks off at Trailhead on Thursday, 3/24, after 7 pm (the Hackfort opening party starts at 5 pm that evening). Finished apps are due at 9 am on Saturday, 3/26, and will be demoed at 3 pm, also at Trailhead, followed by the awarding of prizes.

See you @HackFortFest! (And remember that we also need designers, users, people who transport themselves, idea people — civic hacking is not just for hardcore programmers…)

A few (bunch o’) key Hackfort links:

Other stuff I’m helping out with at Treefort this year:

“Journalism’s New Proof Texts” paper posted for critique

In this sense, the new rise of data journalism mirrors the age old spectrum of the news industry, from yellow to red. It matters little whether the bits or blobs are rendered in lampblack and gum arabic or 1s and 0s. What matters, what has always mattered, is the consciousness of the renderer, the corporate bounds within which he or she works and the attitude of the news consumer. Will data journalism progress along the lines of the corporate backed, market-driven Princeton Radio Project of the 1930s or will the new data journalists take heed of Adorno’s warnings against “stating and measuring effects without relating them to these ‘stimuli,’” the qualitative, objective influence of culture and society on consumers (Adorno, 1969, p. 343)?

From “Journalism’s New Proof Texts: The Peril and Promise of Data Storytelling,” a paper I wrote for Ed McLuskie’s Advanced Critical Theory class at Boise State, now posted for critique session at Academia.edu.

Elevator pitch for Knight #newschallenge on data and communities

I’ve been working on an idea for the Knight Foundation News Challenge on data and communities and wanted to share the evolution of our one-line elevator pitch for the project.

The proposal is to take the idea of a “data repository” (like data.gov, or the City of Boise’s growing data portal hosted on ESRI’s Open Data platform, which I’m also working on) that offers bulk downloads of civic data, and add two more types of material to the catalog: research that actually uses the data and media reporting on the data.

I call this “data in its context,” or “the work done on the data,” and I think it will be convenient to have it all in one place. Also, I think average citizens will be able to make better use of it, better interpret the numbers and contribute back to the research and reporting with their own local insights.

My first stab at explaining this was pretty high-level, and I still like it:

Draft 1

We’re organizing the web in Boise around community data, locally relevant research, government reports and the news in a structured way that scales to local internet spaces around the world.

But it did not speak to the power of communities harnessing their own data, which is the point.

Draft 2

We’re organizing the web in Boise, Idaho, around community data, relevant research, government reports, local journalism and public ideas in a structured way that scales to communities around the world.

Someone pointed out to me that it’s not the web that needs organizing, it’s the locally relevant data, thus:

Draft 3

We’re organizing community data alongside relevant research, government reports, local journalism and public ideas in Boise, Idaho with a web app that will scale to benefit communities around the world.

Then, how will it benefit these communities?

Draft 4

We’re marshalling community data alongside relevant university research, government reports, local journalism and public ideas in Boise, Idaho with an open source web app that communities across the world can use to tell their own data stories.

Finally, after much work and input from many folks, including an intense “Red Team” (pdf) session at Boise State, I arrived at this draft:

Draft 5

Data Cairn is a platform for data storytelling, starting in Southwest Idaho, that allows communities to harness their data along with the work being done on it: relevant university research, government reports, local journalism, visualizations, public ideas and more, in order to discover and demand better solutions.

The feedback phase for the Knight News Challenge is open for one more day, so feel free to leave feedback and applause, if warranted, on our proposal. There are tons of other cool projects on there as well. Well, 1,028 other cool projects …

Here are a few I like:

Scraping Chimp Photos for the Commons

Awesome animal copyright fact: All chimp photos and paintings are public domain.

The Office will not register works produced by nature, animals, or plants. Likewise, the Office cannot register a work purportedly created by divine or supernatural beings, although the Office may register a work where the application or the deposit copy(ies) state that the work was inspired by a divine spirit. – via the Compendium of U.S. Copyright Office Practices, Third Edition

Linking government and academic open data

I’m doing some reading on the open data movement for a new project that we will announce in a few weeks, and I came across an interesting history of the Human Genome Project. I’m looking at links between the open data movement, which is mostly concerned with public government information and its release in free, usable, digital formats, and open data in academia — university research data — which is often treated as private or somehow protected information.

Human Genome Project logo

In 2011, Jorge Contreras wrote about the Bermuda Principles in the Minnesota Journal of Law, Science & Technology (SSRN link). The Bermuda Principles, agreed to in 1996… in Bermuda, stated that human genome research data should be released to the public within 24 hours of being collected.

According to Contreras, key researchers and genetic policy thinkers agreed to Bermuda for three reasons:

  1. to aid project coordination,
  2. to advance science, and
  3. to minimize encumbrances to research that patents on the human genome would cause.
1,000th protein structure
Argonne’s Midwest Center for Structural Genomics deposits its 1,000th protein structure / Photo: Matt Howard (Licensed under CC BY-SA 2.0 via Commons)

Number three is very interesting and certainly has application in other areas of the sciences. But the bigger concept that Contreras analyzes, the idea of openness and data sharing in scientific research, also applies in many other areas.

As discussed above, the more quickly scientific data is disseminated, the more quickly science will progress. Conversely, when the release of data is delayed due to the length of the publication cycle and patenting concerns, it can be argued that the progress of scientific advancement is retarded, or at least that it may not achieve its greatest potential. If data were not withheld until a researcher’s conclusions were published, but released prior to publication, the months-long delays associated with the publishing process could be avoided. Following this line of argument, in an ideal world, maximum scientific efficiency could be achieved by reducing the delay between data generation and data release to zero. That is, the most rapid pace of innovation, discovery of new therapies, development of new technologies, and understanding of natural phenomena could be achieved by releasing scientific data to the community the moment it is generated. — Jorge L. Contreras in Minnesota Journal of Law, Science & Technology 2011;12(1):61-125. (Emphasis added.)
More on this later, but it dawns on me that GitHub is basically designed to reduce the delay between data generation and data release to almost zero: update, add, commit, push, and the public has a current and historical view of your data manipulations.
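That loop is short enough to sketch. A minimal example, with a hypothetical repository and file names (the push is commented out since this sketch has no remote):

```shell
# Sketch of the zero-delay release loop; repo and file names are made up.
# Each commit-and-push publishes both the current data and its full history.
rm -rf /tmp/election-results && mkdir -p /tmp/election-results
cd /tmp/election-results
git init -q

# 1. Update: generate or revise the data file
printf 'county,votes\nAda,1042\n' > results.csv

# 2. Add and commit: record this exact state in the project history
git add results.csv
git -c user.name=demo -c user.email=demo@example.com \
    commit -q -m "Add Ada County results as generated"

# 3. Push: the data is public the moment it exists
# git push origin master   # commented out: no remote in this sketch
```

From the moment of that push, anyone can see both the latest file and every prior revision, which is exactly the near-zero generation-to-release delay Contreras describes.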
Find more data journalism and open data sources on one of my tumblrs: http://journalismprooftexts.tumblr.com/