What Really Is a Record, and What Are We Preserving? Viewing Nitrate Films

What really is a record? I don’t plan on answering that, but it’s a good thought for a Monday, especially after watching this Vice Daily piece on the Nitrate Picture Show at the George Eastman Museum. The museum is one of the few places with a theater still capable of showing original nitrate prints. Several of the interviewees mention how the experience of viewing these films is different for each viewer. Of course, this subjectivity raises the question we should all ask in archives: is it worth preserving the original and allowing it to be viewed when doing so is so complicated? The George Eastman Museum is of course perfectly capable of it, but there’s a bigger question here: how can we make preservation decisions when the user experience is ultimately subjective? Fun thoughts for a Monday morning.


Going to this is definitely on my list of things I have to do.


That’s it, I’ve had it. Let’s kill the term Web 2.0. It’s dated and completely misrepresents what’s really happening online today. To be perfectly honest, it makes those of us in the information professions look bad because we insist on using this term, or the 2.0 terminology generally.

Origins of the Term

I get that it’s still used in professional literature in both the archival and library fields. But where did this term come from, and what was the reaction to it?

To start, if you don’t already know, the 2.0 bit comes from software versioning. Web 2.0 implies that we’ve reached the second version of the web, which is problematic at best because the software that underpins the internet is varied and diverse; if we were actually following those versions, we’d probably be up to version 746 or something. This presents a problem worth addressing a bit later.
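To make the versioning point concrete, here’s a hypothetical little sketch in Python of what taking the “software version” framing literally would look like: every significant platform change gets a version bump, so a literally versioned web would be far past 2.0 by now. The bump count here is invented, chosen only to echo the “version 746” joke above.

```python
# Hypothetical sketch: if the web really followed software versioning,
# every significant change would bump the version number.

def bump_major(version):
    """Increment the major component of a (major, minor) version pair."""
    major, _minor = version
    return (major + 1, 0)

web = (1, 0)  # the "static" web the 2.0 crowd had in mind

# Imagine one major bump per significant platform change since the 1990s
# (an invented count, for illustration only).
for _ in range(745):
    web = bump_major(web)

print(f"Web {web[0]}.{web[1]}")  # prints "Web 746.0"
```

The point of the toy example is just that version numbers track releases of a specific piece of software; the web isn’t one piece of software, so “2.0” was never a meaningful version of anything.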

Briefly, this term was first coined in 1999 and popularized in 2004 by Tim O’Reilly, namely at a conference he started to explore the web, its content, and tools.[^1] On a basic level, the term is taken to mean interactivity and dynamic content online, often thought of in terms of user-generated content.[^2] The reasoning for the 2.0 moniker was that this was a new version of the web, one greater than the previous web, which was seen as static rather than dynamic. Tim Berners-Lee didn’t like the term and thought of it as jargon, though he was personally excited by the idea of the semantic web.[^3] So taken together, we have a term that’s around 17 years old and had problems even when it was coined. In some respects, using a term like 2.0 calls to mind the New Coke fiasco: “New Internet, now with more content!” It’s buzzy, but more on that later.

As a side note, ALA has adopted the software versioning approach seen with Web 2.0 for its own purposes. We now have Library 2.0: shinier and better, but with fewer books to get in the way. I kid, but this suffers from many of the same issues as Web 2.0. It’s just jargon and doesn’t tell our users anything.[^4]

Reasons for Moving Away

So what’s wrong with the term? Let’s start with Berners-Lee’s comment about it being jargon. It most decidedly is jargon: a term used by a profession that usually has little meaning to those outside it. For a while, it may have made those of us in libraries feel a connection to the tech field, like we were out on the frontier. It’s a fun little term we information professionals can throw out to make ourselves look “modern.” How many times have you heard or said the following? “Look, administrator, we have the Web 2.0s because our organization is on Facebook.” The reality is that you aren’t doing anything other than checking off a box and throwing, well, lobbing is more like it, a buzzword around. Our literature almost demands that you have Web 2.0. Upgrade or perish! But as anyone who has really worked with social media in libraries knows, this isn’t something to jump into without planning. How an organization works online is a programmatic initiative, not something a term like Web 2.0 addresses.

Another reason to avoid jargon is that it confuses our users. Some may say, “Neato, my library has Web 2.0,” but ultimately they care about the interaction they’re having with your organization. Focus on that interaction: what does the tool actually do to make the interaction meaningful and positive? Then start talking about the interaction, not about this Web 2.0 mess. If it’s so users can use materials from home and supply their own comments, tell them that. If it’s so they can create their own research collections, explain that. Users will appreciate being able to understand you without jargon getting in the way.

Another issue I have with this term is that it’s not precise and hasn’t kept up with advancements in online communication and systems. Most libraries and archives have focused on the user-generated content aspect of Web 2.0, and that aspect alone is what much of the literature deals with. This runs the gamut from Facebook comments to tagging in catalogs to image annotation. So when I see the term used, it’s really about how libraries and archives interact with their users, both to help them discover content and to provide meaningful description of that content. That’s a worthy pursuit, but because it is the usual usage, it misses much of what the modern web offers: access to big data, the semantic web, collaborative and real-time interactions, private sharing, and so on. Essentially, by focusing on this term and its understood meaning in the field, we limit our understanding of what work can be done online and what we can do with online resources. We also fixate on one aspect of online work and foreclose discussion of other kinds of work simply because they don’t fall within this intellectual framework. That could hold information professionals back intellectually.

So to repeat, the reasons Web 2.0 is bad: it’s jargon, and it’s not precise. I say we start saying what we mean. If you’re using Facebook, say you’re working with social media to interact with your users. If you’re working with big data, say that. Be open in your explanations and avoid jargon that doesn’t explain what you’re doing. Maybe this is just a pointless rant. Maybe I should embrace the collective will and succumb to the term. Maybe I’ll be the only one who hates this word, but I’m okay with that. I’ll just have to be happy in my little corner of the web, covered in ash, calling for the burial of Web 2.0.

[^1]: http://www.paulgraham.com/web20.html
[^2]: https://en.wikipedia.org/wiki/Web_2.0#cite_note-graham-1
[^3]: http://news.bbc.co.uk/2/hi/technology/4132752.stm and http://www.ibm.com/developerworks/podcast/dwi/cm-int082206txt.html
[^4]: http://www.ala.org/tools/atoz/library-20

Review – Ratchet and Clank (2016)


The new PS4 exclusive Ratchet & Clank is a sort of reboot/re-imagining of the classic PlayStation Ratchet & Clank games. This 2016 take on the series starts back at the series’ beginnings but still includes the hallmark guns the series has been known for since its first appearance on the PS2 over a decade ago. The story ties into the movie set to be released on April 29 and closely mirrors that first outing in 2002, although in this version of the original game, Dr. Nefarious is introduced and plays a major role. The game sticks fairly closely to its platformer roots.

The Good


Let’s start with the good. The graphics are decidedly improved. I’m not a fan of judging a game solely on its graphical prowess and polish, but in this case the PS4’s power is put to good use, though I did occasionally get some stuttering when a large number of enemies met my good Bouncer. The environments are without a doubt gorgeous. One plus many aren’t picking up on is that this game is cheaper than most games from the larger studios. Starting at $39.99, with no costly DLC in sight, it’s a more respectable use of your gaming money.

Other pluses in this new installment include sticking to the old formula with slight updates. This feels like a Ratchet & Clank game should and scratches the platforming itch in the best possible way. There are also some notable improvements to gameplay, including the ability to strafe.

The Bad


My biggest complaint is the relative brevity of the game. Considering it costs only $39.99, maybe a campaign longer than 10 hours shouldn’t be expected. That aside, a true fan may not feel like they’ve had their fill of our lombax and his mechanical sidekick by the end of their first playthrough.

The upgrade systems for the weapons and for Ratchet in general have changed, mostly for the better, but a small gripe is the card-collecting mechanic. Collecting what the game calls holocards nets the player boosts to certain stats and unlocks for omega weapons. The mechanic works: cards are scattered throughout the levels and are also acquired through random drops. The issue I have is that there’s no mechanic for dealing with cards once you have them all, because you keep collecting them after your sets are complete. This seems like an oversight and a missed opportunity for some sort of reward for players attempting to 100 percent the game. Maybe the ability to trade extras in for bolts, which would help with unlocking omega weapons, would have been nice.

The game also doesn’t seem particularly difficult, even on the hardest setting in challenge mode, which is unlocked after completing the game. Maybe I’m just that good at these games, but I can’t say I was challenged all that much during my first playthrough, or even most of the way through challenge mode. Then again, maybe that isn’t the purpose. Maybe it should just be a game for a good time, which Ratchet & Clank excels at even when it’s not that difficult.

The Verdict


I think this is definitely one you’ll want to play now. I have minor issues with the length, but if you’re a platformer fan, this should find its way onto your gaming queue. It’s great fun even when it isn’t too difficult.

Exam Season

I have a history of using memes to announce that exam period is beginning. I confess to using bleak cultural references to do this because it makes me giggle. Hopefully a bit of humor breaks up the test-taking stress; I know I always appreciated it as a student. Here’s to finals season. May the odds be ever in your favor.

Freud, Derrida and Electronic Records – Beginnings

For some reason, whether hubris, scholarly ambition, or possibly madness, I’ve begun an extremely slow and careful reading of Derrida’s Archive Fever. Postmodern issues, and this work in particular, often crop up in classes, research, and impolite conversation, and I felt I needed to give it a careful reading and test my scholastic mettle. If you’re not familiar with this work, which began life as a lecture, its major premise is that it’s essentially a “fever,” or a disease, to believe that the archive is a reliable repository for information: that the archive, due to many complications, is a poor home for our collective memory. Derrida explains this by investigating Freud and his many concepts of individual memory.

Unexpectedly, as I approached the end of the “Exergue,” I noticed that Derrida begins what I consider a hasty discussion of electronic records and their impact on archival veracity. He calls technology “these radical and interminable turbulences.”1 He further suggests that today’s technology causes issues with what is archivable and with the process of archiving itself. Derrida seems to imply that older documents are more reliable records than modern ones: their creation had a certain intentionality, and that intent makes them more trustworthy, whereas modern records have been deeply shaped by technology and “archival structure,” which lessens their reliability. This is an interesting point, if that is indeed Derrida’s intent in this short aside on technology and archives. Unfortunately, it is quickly introduced and then dropped, with the reader left to ponder ideas like the “Mystic Writing Pad.”2

Derrida does promise to return to this issue later in the work, but this is an interesting premise that raises an important question: does the existence of new technologies that remove some intentionality from the creation of archival records somehow remove trust from the archives? My opinion is that it doesn’t, though I reject more of the idea behind Archive Fever the more I read of it. That doesn’t mean there aren’t unexpected questions this complex piece can pose to the archival profession. So just as Derrida promises to explain himself better, I too promise to keep reading.

  1. Archive Fever, 18. 
  2. This Atlantic piece discusses the concept of the Mystic Writing Pad in a better context than does Derrida: http://www.theatlantic.com/technology/archive/2013/01/the-mystic-writing-pad-what-would-freud-make-of-todays-tablets/272512/