Question 166 of 365: How can we scrape better?

A social network diagram (image via Wikipedia)

I guess I might as well go on the record stating that the future of networks is in scraping. I know that some people call it the social graph and talk about the power of connections and links and liking and all of that, but I think it is much better to just call it scraping. Turn on the full Twitter or Facebook firehose if you have access, but it doesn't mean anything unless one person sees significance in the data that is being scraped and served up to them.

Just so I am clear: scraping is something that is done with the links part of sharing and the location part of publishing, with the metadata about all of the things that we are passing around. It lives in the description of the thing rather than the thing itself.
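
To make that concrete, here is a minimal sketch of what reading the description rather than the thing might look like. It assumes the requests and beautifulsoup4 packages and simply collects a page's Open Graph metadata; the function name and example URL are mine, not any particular product's.

import requests
from bs4 import BeautifulSoup

def scrape_metadata(url):
    """Return the Open Graph description of a shared link, not the thing itself."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    metadata = {}
    for tag in soup.find_all("meta"):
        prop = tag.get("property", "")
        if prop.startswith("og:"):  # og:title, og:type, og:image, og:video, ...
            metadata[prop] = tag.get("content", "")
    return metadata

# Example (hypothetical URL): the metadata tells us what kind of thing was
# shared -- a video, an article, an image -- without ever opening the thing itself.
# print(scrape_metadata("https://example.com/some-shared-link"))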

The benefit of such a thing was made real to me by a single product called DejaPlay. This video pretty much sums up its features:

Suffice it to say, this iPad app makes my network real again. I can sit back and watch the things that all of my friends and colleagues have been talking about, and I can finally catch up on the inspiration that everyone around me has been drawing on. The app simply scrapes all of the YouTube and other video links from your Facebook friends and Twitter followers and then creates a video playlist for you to enjoy. It does this one thing incredibly well. So well, in fact, that it makes me never want to click on a raw link again when it could be served up to me in a better format.
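
Out of curiosity, here is roughly how I imagine the core of that trick working. This is not DejaPlay's actual code, just a toy Python sketch that pulls the video links out of a pile of made-up status updates and queues them as a playlist.

import re

# Matches YouTube and Vimeo links inside a post's text.
VIDEO_PATTERN = re.compile(
    r"https?://(?:www\.)?(?:youtube\.com/watch\?\S+|youtu\.be/\S+|vimeo\.com/\S+)"
)

def build_playlist(posts):
    """Scrape the video links out of a stream of updates, in order, no repeats."""
    playlist = []
    for post in posts:
        for link in VIDEO_PATTERN.findall(post):
            if link not in playlist:
                playlist.append(link)
    return playlist

sample_posts = [
    "This talk blew my mind: https://www.youtube.com/watch?v=dQw4w9WgXcQ",
    "Lunch was excellent today.",
    "Worth ten minutes of your day: https://vimeo.com/12345678",
]
print(build_playlist(sample_posts))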

Twitter Times does this as well with text-based links (as do Google Reader and Delicious, to a lesser extent). It creates a fully functioning newsletter out of all of the links that have come through your Twitter feed, but I am afraid that this just isn't enough after watching DejaPlay work its magic. I don't want a list anymore. I want a tactile exploration of what is going on within my network. I want to explore and have it autopopulate as new information becomes available. I want to dig deeper into what people are reading and watching and see the context for everything they are sharing. I want to know who is creating the work I am consuming, and more than that, I want an elegant interface that shows me the path I have taken down the rabbit hole. While DejaPlay doesn't currently let me retweet the videos I am watching, I have received direct word from the developers that this is coming in the next major release. Once it does, the experience will be complete: Network, Scrape, Consume, Contribute.

That is what the process should always be, not to put too fine a point on it. We network to create a space worthy of inhabiting. We find people who are a part of the conversations we wish to be involved in. We connect to them and start thinking through the problems that we want to solve with their help. We scrape (or should start scraping, anyway) because we know that the value of their contributions is not simply in having them close at hand, but in knowing that they are carefully curating our library of knowledge. We selected them in the networking phase, and they pay dividends simply because we chose them. Then we must consume, ravenously, everything that we can from our network. We know that it is good, and so we must read and watch and absorb all of the good things that others have to offer us. This type of consumption, unlike our daily bodily intake, never leaves us feeling full. The more we consume, the more we want to take in. We make the connections that were always there, waiting for us to make them real. Our final step is to contribute back to the network. We curate and add to other people's libraries. But now we need to make sure that we are remixing and recontextualizing. Here is where the future is going to come in handy.

Our stuff must get more scrape-able. When we share things, we must share them knowing full well that their contexts must be shifted a hundred times for our network. We must realize that the lists and posts that we have conjured up are meaningless without the ability to dissect them.

Here is what I want:

I would like to see all of the people I have carefully chosen as part of my network and to choose what kind of media I would like to scrape from their various shared places. I would then like to be able to flick their videos from their profiles (using my hands, of course) into one corner of my screen. I would like to be able to choose certain people in my network and flick their blog posts into another corner. Then I want to flick the podcasts of those I know to be quite eloquent with the spoken word (or to have an ear for it, anyway) into a third corner. The fourth corner I will save for images from those who seem to have an uncanny knack for finding the best arguments through pictures.
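
If I had to sketch that four-corner sort in code, it might look something like this; the people, links, and classification rules are all invented for illustration, not pulled from any real app.

from collections import defaultdict

def classify(url):
    """Guess which corner a shared link belongs in (rules are illustrative only)."""
    if any(host in url for host in ("youtube.com", "youtu.be", "vimeo.com")):
        return "videos"
    if url.endswith((".mp3", ".m4a")) or "podcast" in url:
        return "podcasts"
    if url.endswith((".jpg", ".png", ".gif")) or "flickr.com" in url:
        return "images"
    return "blog_posts"

def flick_into_corners(shared_items):
    """shared_items is a list of (person, url) pairs; returns corner -> items."""
    corners = defaultdict(list)
    for person, url in shared_items:
        corners[classify(url)].append((person, url))
    return dict(corners)

corners = flick_into_corners([
    ("alice", "https://vimeo.com/987654"),
    ("bob", "https://example.com/essays/on-networks"),
    ("carol", "https://example.com/show/episode-12.mp3"),
    ("dave", "https://www.flickr.com/photos/dave/42"),
])
print(corners)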

I want to be able to play a podcast while looking at the images and then comment on each one directly as I go through. I want to be able to watch a video and then step into the Twitter conversation that it sparked. I want to be able to see the blog posts highlighted with everyone's annotations and then copy and paste my favorite parts, alongside a few of the images that I found, and link it all together with the videos that are going around the network as well.

This is what scraping will do for us in the not so distant future. We will be able to remix any type of content into a new one with as few steps in between as possible.

We will know we have reached the point of truly enlightened scraping when we no longer have to care where things are posted (Facebook, Twitter, Buzz, Flickr, etc.). We will simply see the people in our network and be able to literally grab hold of what they have shared and put it to our own uses. This will be the future, and this will be now, too.
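
In code terms, that indifference to where things are posted might amount to nothing more than normalizing everything into one shared-item shape, so the interface only ever deals with people and media, never with platforms. The field names below are my own assumption, not any real service's API.

from dataclasses import dataclass

@dataclass
class SharedItem:
    person: str      # who in my network shared it
    media_type: str  # "video", "post", "podcast", "image"
    url: str         # where the thing itself lives
    source: str      # "facebook", "twitter", "buzz", "flickr" -- kept, but the interface ignores it

def from_tweet(author, link):
    return SharedItem(person=author, media_type="post", url=link, source="twitter")

def from_flickr_photo(owner, photo_url):
    return SharedItem(person=owner, media_type="image", url=photo_url, source="flickr")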
