


While there has been much breathless speculation about the threat that Twitter poses to Google’s iron grip on the search market, I haven’t seen many specific ideas about how that would work. By now we all know that searching real-time content is an increasingly valuable way to find information, but its utility seems limited to certain kinds of information, like breaking news events and consumer product reviews. It certainly doesn’t appear to be as flexible a mechanism for locating the wide range of content as traditional search has been since the inception of the world wide web.

The most intriguing speculation that I’ve read about the future of search has been this post about the notion of ‘PageRank for People’. In essence, the idea is that the current algorithms that govern the ranking of search results are inadequate because they rely too heavily on the location of the content being listed. Since Google relies on the volume of inbound links to judge the value of content, it favors content posted in popular locations. The thing is, a piece of good content is just as valuable whether it is posted to ‘Bob’s Blog’ or the ‘New York Times’. The solution that was proposed was a system that factors in the ‘reputation’ and ‘authority’ of the content’s author when ranking search results. Just how to calculate these numbers, though, is the tricky part. One answer might be Twitter.

There are two aspects of Twitter that need to be changed. Both the ‘Retweeting’ and ‘Hashtag’ behaviors need to be provided as official features of the service. That means the mechanism for these actions needs to be separated from the 140 character text string of each tweet. That is to say, we should lose the ‘RT’ syntax and make the identity of the original poster some form of metadata that exists outside the post itself. Likewise, tags like those currently labeled with hash-signs ‘#’ should be saved as metadata separate from the actual tweet. With those two changes, searching the web could become a lot more useful. Here’s how it would work:
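As a small sketch of what that would mean structurally, here is one hypothetical way a tweet record might look once the retweet lineage and tags are pulled out of the 140-character body and into separate metadata fields. All of the field names below are made up for illustration:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Tweet:
    """A tweet whose retweet lineage and tags live outside the 140-character body."""
    tweet_id: str
    author: str
    body: str                                      # the 140-character text, with no 'RT' or '#' syntax
    original_tweet_id: Optional[str] = None        # set on retweets; points at the source tweet
    tags: List[str] = field(default_factory=list)  # semantic labels, editable on each retweet

# A retweet that keeps the original body but drops one tag the retweeter disagrees with
original = Tweet("1", "alice", "Real-time search is going to matter", tags=["search", "twitter"])
retweet = Tweet("2", "bob", original.body, original_tweet_id=original.tweet_id, tags=["search"])
```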

When I publish a post to Twitter, I should have the ability to tag that post with semantically accessible identifying labels. Each post associated with a given subject (as described by the label) potentially contributes to my ‘authority’ about that subject. Now, when someone retweets my post, they are in essence endorsing what I have said and contributing to their own ‘authority’ about that subject. If there were a scoring mechanism that assigned, say, one point for each endorsement, then you would have a system that establishes quantitative values for ‘authority’. Imagine it this way:

[Image: tweets]

For each retweet, additional points are cascaded up the tree. That way the original poster is always given the most credit for contributing the idea, but those who help propagate it are given credit as well. Authority is defined by the community, not the individual. It should be pointed out that for each subsequent retweet, the poster will have the opportunity to revise the tagged metadata, either adding more detail or removing labels they believe are not appropriate. That way, the system guards against abuse by so-called trend-squatters.
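Here is a rough sketch of how such a cascading score might be computed over a retweet chain. The one-point-per-endorsement rule, the per-tag bookkeeping, and the example tweets below are all illustrative assumptions, not a spec:

```python
from collections import defaultdict

# authority[user][tag] -> score, built up from retweet endorsements (hypothetical structure)
authority = defaultdict(lambda: defaultdict(int))

# Each tweet records its author, its parent in the retweet chain (None for originals), and its tags
tweets = {
    "t1": {"author": "alice", "parent": None, "tags": ["search"]},
    "t2": {"author": "bob",   "parent": "t1", "tags": ["search"]},   # bob retweets alice
    "t3": {"author": "carol", "parent": "t2", "tags": ["search"]},   # carol retweets bob's retweet
}

def record_retweet(tweet_id: str) -> None:
    """Cascade one point per tag up the retweet chain, so the original poster
    collects the most credit while intermediate propagators still earn some."""
    ancestor_id = tweets[tweet_id]["parent"]
    while ancestor_id is not None:
        ancestor = tweets[ancestor_id]
        for tag in tweets[tweet_id]["tags"]:
            authority[ancestor["author"]][tag] += 1
        ancestor_id = ancestor["parent"]

for rt in ("t2", "t3"):
    record_retweet(rt)

print(dict(authority["alice"]))  # {'search': 2} -- alice is credited by both retweets
print(dict(authority["bob"]))    # {'search': 1} -- bob gets credit for helping propagate
```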

Now, once we start getting values assigned for Twitter users’ authority on specific topics, search engines can start factoring this into their rankings. Content authored by an individual with higher authority on the subject of that content would be favored over other content. Content authored by organizations might be scored, in part, by the collective authority of that organization’s members. It would create a tremendous upward pressure to contribute value to the community. For some industries, I imagine that one’s scores in this respect would become a factor in employment decisions or compensation levels.
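To make the idea concrete, here is a toy sketch of how a search engine might blend an author’s topical authority into an existing relevance score. The blending formula, the 0.3 weight, and the organization-scoring rule are placeholders of my own, not anything Google or Twitter actually does:

```python
def ranked_score(base_relevance: float, author_authority: float,
                 authority_weight: float = 0.3) -> float:
    """Blend a conventional relevance score with the author's topical authority.
    The formula and the 0.3 weight are arbitrary placeholders."""
    return base_relevance * (1.0 + authority_weight * author_authority)

def org_authority(member_authorities: list) -> float:
    """One possible organization score: the average authority of its members."""
    return sum(member_authorities) / len(member_authorities) if member_authorities else 0.0

# Two equally relevant pieces of content; the higher-authority author ranks first
print(ranked_score(base_relevance=1.0, author_authority=5.0))  # 2.5
print(ranked_score(base_relevance=1.0, author_authority=1.0))  # 1.3
print(org_authority([5.0, 1.0, 3.0]))                          # 3.0
```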

Now, what I’ve described would need to be only one part of the search ranking algorithm. As described by the original post, there are many other factors that should be considered. Additionally, the scoring mechanism described above is probably far too simplistic and vulnerable to abuse. For example, one refinement that could make the system more reliable would be to consider the reputation of the endorsing party when assigning a value to the score that their retweet provides the original poster. That way, it is more valuable to get retweeted by individuals with more authority. Note that ‘authority’ as I’ve described it is an entirely separate metric from ‘popularity’, which is defined by the number of followers that a user has.
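As a sketch of that refinement, an endorsement’s value could scale with the endorser’s own authority on the topic rather than counting as a flat point. The constants below are arbitrary placeholders:

```python
def endorsement_value(endorser_authority: float, base_point: float = 1.0,
                      reputation_weight: float = 0.5) -> float:
    """Weight a retweet's contribution by the retweeter's own topical authority,
    so an endorsement from a recognized expert counts for more than one from a
    brand-new account. The constants are illustrative, not tuned."""
    return base_point + reputation_weight * endorser_authority

# A retweet from someone with authority 10 on the topic vs. someone with authority 0
print(endorsement_value(10.0))  # 6.0
print(endorsement_value(0.0))   # 1.0
```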

There would be many additional side-benefits of such a system. For example, much of what is posted on Twitter are links to content elsewhere on the web. A robust labeling function would turn Twitter into a tagging system for the entire semantic web.

Here is a related discussion.

The flood of activity surrounding the turmoil in Iran on Twitter the past few days has got me thinking about ways to make the platform more useful when big news events occur. Much has been said about how this episode is demonstrating the failure of traditional media, as real-time media has now become the main source of breaking information in Iran. There is no doubt that Twitter, Flickr, YouTube and the like are now a very important resource, but I think it’s important to recognize what we might lose if we completely throw the CNN model overboard. Important services that traditional TV news outlets provide include intelligent filtering of the raw data stream and a format that makes commentary accessible. TV news coverage does these things well, in that there are news directors who prioritize information and a presentation format that weaves background information and editorial content together in a digestible package. These pieces have been notably absent from the ‘coverage’ provided by #IranElection.

[Image: anchr1]

Here is an idea that occurred to me, for a specialized blogging platform that would allow anyone with sufficient passion or knowledge of an event to produce their own live news coverage of a story, broadcast it to the web, and take advantage of all these amazing real-time media tools. Imagine a website, let’s call it Anckr.com, that you could go to when a story breaks. If you wanted to provide real-time news and analysis of the story, you could log in via OAuth with your Twitter credentials. You would then be presented with an administrative interface where you could set variables like the headline of the story, a friendly-looking URL, and a hashtag for your news coverage.

You would then have the opportunity to tweet directly from this interface with commentary on the event. In addition to your regular Twitter feed, your tweets would show up on the public-facing news page that you’ve generated (pictured below in section ‘A’). In this respect, you would be serving as a news anchor for the story. As anchor, you would have the opportunity to populate your news page with real-time content from many different sources. For example, any relevant photos or videos could be embedded into section ‘E’. You could also display relevant news feeds from traditional news outlets there (if they are bothering to provide coverage at all). This section could also provide space to display a feed of viewer feedback, populated by tweets from viewers who have tagged their posts with the hashtag for your coverage.

[Image: anchr2 — mock-up of the Anckr news page]

If you wanted to provide more compelling updates, there could be a section of the page that allows you to display a live video or audio stream of yourself (section ‘D’) providing commentary. Relevant hashtags could be displayed easily (section ‘B’), providing links to more conversations about the event.

Most importantly, there would be a section where you could provide a moderated feed of micro-blogging content (section ‘C’). The Anckr interface would allow you to follow individual tweeple who you believe are valuable sources. For example, you could have a feed for ‘witnesses’ of the event, people tweeting from the scene. Or, you could have a feed comprised of knowledgeable commentators.
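As a very rough sketch, the moderated feeds might be nothing more than routing rules applied to the incoming tweet stream: hand-picked accounts go to the ‘witness’ and ‘commentator’ sections, and anything carrying the coverage hashtag goes to viewer feedback. Every account name, hashtag, and field below is made up for illustration:

```python
# Hypothetical curated lists and coverage hashtag for an Anckr-style page
coverage_hashtag = "#iranelection"
witnesses = {"tehran_onscene", "streetreporter"}
commentators = {"mideast_prof", "policy_wonk"}

def route_tweet(tweet: dict, page: dict) -> None:
    """Send a tweet to the witness, commentator, or viewer-feedback section."""
    author, text = tweet["author"], tweet["text"].lower()
    if author in witnesses:
        page["witness_feed"].append(tweet)
    elif author in commentators:
        page["commentator_feed"].append(tweet)
    elif coverage_hashtag in text:
        page["viewer_feedback"].append(tweet)

page = {"witness_feed": [], "commentator_feed": [], "viewer_feedback": []}
incoming = [
    {"author": "tehran_onscene", "text": "Crowds gathering downtown"},
    {"author": "random_viewer", "text": "Thanks for the coverage #IranElection"},
]
for t in incoming:
    route_tweet(t, page)

print(len(page["witness_feed"]), len(page["viewer_feedback"]))  # 1 1
```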

Now, I could imagine traditional news organizations using a branded version of this sort of interface. Alternatively, anyone who chose to start a page could become their own ‘reporter’. Since no one can provide 24-hour analysis by themselves, it would be cool if there were a mechanism that allowed you, as the anchor, to hand off control to someone else of your choosing, kind of like how BNO News does now on Twitter.

All of the tools for creating such a website exist. I’ve listed some of the resources above, including: Twitter, FriendFeed, UStream, YouTube, Flickr, TweetCloud and Paratweet. These are just some examples, but I’m sure there are many others.

The closest thing that I’ve found is almost.at, which is very cool, but more of an automated filtering service. If you are aware of anything like what I’ve described above, please let me know. If I had any free time, I might build this myself. Since I don’t, I hope someone gives it a try.

Topics include: Twitter TV, Trending topics, and the Twitter Apocalypse

ThotCast ep 7: The Comeback Episode

or subscribe to the podcast feed directly, or get it from iTunes.

Topics include: Ashton v. CNN, Twitter adopts new open 3rd party authentication mechanism called ‘Sign In with Twitter’, Tweetie for Mac launches

ThotCast ep 6: The Authentic Episode

or subscribe to the podcast feed directly, or get it from iTunes.

 

Topics include: Twitter helps a revolution in Moldova? Spam virus outbreak. Twitter browser war escalation between Twhirl and Tweetdeck, but I like Nambu.

ThotCast ep 5: The War Episode

or subscribe to the podcast feed directly, or get it from iTunes.

TV is in trouble. DVRs like TiVo neuter the efficacy of advertisements on broadcast outlets, and the internet is threatening the subscription model that supports cable. Marketplaces like iTunes and Apple TV are becoming increasingly attractive to consumers who question the value of paying for access to hundreds of channels that they don’t watch. And, despite experimenting with streaming services like Hulu, content providers are getting cold feet now that the technology has migrated from the computer screen to the TV itself.

Cable stalwarts like Mark Cuban argue that internet-based à la carte services, like those provided by iTunes, Hulu and Boxee, will not be able to compete with cable in terms of price. Cuban believes that the current prices charged to purchase commercial-free TV shows are artificially low because the business model is subsidized by revenue from cable subscriptions. Take away the cable TV business, he says, and suddenly the internet becomes a lot more expensive and less attractive to consumers.

The problem can be summed up like this: the internet doesn’t seem to be as profitable as traditional TV, but traditional TV is starting to wither. This is a lot like the problem newspapers face; it’s just earlier in the process. What to do?

In earlier posts, I’ve argued that the answer lies in getting creative. Thinking about this tonight, I tried to imagine the ideal TV setup for me as a consumer, and how it would all work, not just for me but for the advertisers as well. The following is just part of the system I imagined, and I’ve not heard this proposed before. So, in the spirit of creativity, I offer the following idea to the industry:

The Consumer-Directed Content Pricing Model

Here’s how it works. I have a set-top box, let’s say an Apple TV. With this device I can do all the stuff I can currently do with my Apple TV: purchase or rent TV episodes and movies, download podcasts, etc. But, when watching a program, I have options about how to pay. As shown in the following mock-up, the interface provides me with a slider that lets me set the price I’m going to pay to purchase the program.

[Image: dreamtv1 — mock-up of the pricing slider]

I could choose to pay the full price of $12.99 to download and keep the program commercial-free, as I can on the current Apple TV. Or, I could opt to pay nothing and get the program with a full complement of commercial breaks. The viewing experience would be much like watching the show on broadcast television in this case, except I can time-shift it and watch it as many times as I want (as with a DVR). Unlike a DVR, I would be forced to let the commercials run in their entirety, much the way Hulu requires on their website. Alternatively, I could move the slider to some mid-point between the two extremes. This would allow me to pay a modest price in exchange for fewer commercials.

[Image: dreamtv2]

The more I’m willing to pay, the fewer commercials I’ll get and the shorter the running time will be. 

Now, to make this practical, the ads should not be part of the video download; only metadata marking the insert points for them in the timeline of the program would be included. When a commercial break occurs, the ads are streamed in from the web. That way, advertisers can insert timely ads into a viewing, even if the consumer purchased the program some time ago.
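Here is a back-of-the-envelope sketch of how the slider and the streamed ad breaks could fit together. The linear price-to-breaks rule, the $12.99 full price, the break count, and the ad-fetching stub are all hypothetical:

```python
import math

FULL_PRICE = 12.99
TOTAL_AD_BREAKS = 8
ad_insert_points = [7.5, 15.0, 22.5, 30.0, 37.5, 45.0, 52.5, 60.0]  # minutes into the program

def breaks_for_price(price_paid: float) -> int:
    """Pay nothing, keep all breaks; pay full price, keep none; scale linearly in between."""
    fraction_paid = min(max(price_paid / FULL_PRICE, 0.0), 1.0)
    return math.ceil(TOTAL_AD_BREAKS * (1.0 - fraction_paid))

def fetch_current_ad(insert_point: float) -> str:
    """Stand-in for streaming a fresh, timely ad from the web at playback time."""
    return f"<ad streamed for minute {insert_point}>"

kept = breaks_for_price(6.50)  # roughly half price -> roughly half the breaks
playback_ads = [fetch_current_ad(p) for p in ad_insert_points[:kept]]
print(kept, playback_ads[:2])
```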

As a consumer, I like this, because it gives me flexibility and control to watch what I want, whenever I want, and pay whatever I think is fair. As an advertiser, I like this because I get all the benefits of the traditional broadcast ad model, avoid the pitfalls of DVRs, and can target my ads to viewers based on the psychographic profile generated by my viewing habits. 

The only folks this model cuts out are the broadcast networks. They’re screwed. I’m sure cable providers will hate it too, since they’re determined to be involved in my content decisions. But, hopefully, they will embrace the inevitability of their utility status and focus on providing us consumers the most excellent, screaming-fast dumb-pipe internet connectivity possible.

Does this model solve the problem? I’m sure it’s not that simple, but this is the sort of creative thinking that I’m not seeing from most of the players in the industry. One notable exception is Boxee, a company which at the moment seems more interested in improving the Apple TV device than Apple does.

There are other things that I want from TV as a consumer, but I’ll save those ideas for future posts.

UPDATE: Since writing this, I found this post that describes a similar idea. Cool.

 

Topics include: Twitter and Google, sittin’ in a tree (t-a-l-k-i-n-g), search functionality on Twitter, technical problems on Twitter and backing up your Twitter data, Demi saves a life?, Italian earthquake debuts on Twitter.

ThotCast ep 4: The Trouble Episode

or subscribe to the podcast feed directly, or get it from iTunes.

[Image: minimob]

One of the lines I like to use when discussing how media is changing these days is, “The internet is punishing inefficiency.” I’ve forgotten who I lifted that line from, but I now add the following chaser: “The economic downturn is accelerating that process.” There is no doubt that the ways in which people are consuming information are changing rapidly. Newspapers are reeling, the recording industry has succumbed to the inevitable, and TV providers are getting nervous. Likewise, the industries that rely on gaming consumer attention, namely Advertising and PR, are scrambling to understand what their jobs are anymore. Part of the problem they face is that they are chasing a moving target, and no one is sure where it is headed. 

I tweeted a few weeks ago that I thought the challenge of deriving value from Twitter was largely a problem of managing the ‘signal-to-noise ratio’. There’s a lot of useful content out there, but if it isn’t parsed from the firehose for you, it’s of little value. This, of course, is why we have media in the first place. The role of writers and broadcasters has always been to parse the wide world of ideas and bring us the nuggets we prize for their news or entertainment value.

In the beginning, ideas were spread sideways. That is to say, we communicated almost entirely by word of mouth, and memes reproduced via a cascade of parallel paths. The trouble with this method was that it was slow and prone to error.

Next, technology made a top-down model practical, as it became economically feasible for small groups to communicate with large groups using things like print and broadcasts. This was the birth of ‘media’; it dealt with the speed and error problems and was tremendously useful. The problem with this model, however, was that the masses were all largely subjected to the same set of information, regardless of individual need. This system also made public attention vulnerable to manipulation, a loophole that the marketing industries have exploited for centuries.

Then another technological advancement made it practical for anyone to communicate with everyone. This made it much easier to customize sets of data and package them for small audiences. The problem became that firehose I mentioned a bit ago. Suddenly we were back at the beginning of this cycle again. At first there was a sideways model online, whereby users relied on tools like email to move the stuff around. Then a top-down model arose as publishers and broadcasters attempted to apply their skill set to the web.

Then, for the first time, the web started to try something new. Taking advantage of the technology, models for parsing information were attempted that allowed the masses to self-regulate the firehose. For the first time we tried allowing our peers to do the parsing; leaning on our collective wisdom, services like Digg and Reddit emerged to help tailor content to individual need. The shortcoming of these so-called ‘democratized content forums’ was that they put individuals whose content needs stray far from the average at a disadvantage.

About two years ago I speculated about what the next model would be. At the time I became enamored with the idea that so-called ‘prediction markets’ might just be that next model (see also ‘Infotopia’ for a good discussion of this). It turns out (current economic crisis notwithstanding) that stock market mechanisms are extremely efficient tools for extracting wisdom from groups. This fascination of mine resulted in an experiment called ThotMarket.com, which was my attempt to apply the model to news and information. I now believe I misread the trends. The actual next model is the one being popularized by Twitter.

The innovation that this micro-blogging model provides is that content is parsed by the collective wisdom of the group, much like the voting forums, but it is a group of my choosing. As a Twitter user, I manage my own mini-mob, and that mini-mob manages my content for me. 

Now, managing the mob is not an easy thing to do. At the moment the tools available to accomplish this on Twitter are crude and cumbersome. But these will improve as Twitter improves itself, or as someone comes along to build a better micro-blogging platform. Much of the attention of Twitter application developers so far has been focused on creating new tools to parse out content directly from Twitter. There is an ever-growing library of live-streaming, filtering, and searching interfaces for Twitter. These are welcome and valuable additions to the service, but I think they miss the mark a bit in terms of taking full advantage of this new form of media. By focusing on the content of the tweets themselves, most do not directly factor in the parsing ability of individual people. I’m waiting for more sophisticated tools that will allow me to find new tweeple based on expertise, interests and quality. I want these tools to also help me prune the herd of tweeps that aren’t holding their own. In essence, I want tools that will help me make my mob better than your mob.

There are some tools like this now. Mr Tweet comes to mind, as does Kevin Rose’s new pet WeFollow. These are good starts but leave a lot to be desired. For example, WeFollow is very comprehensive, but relies solely on the number of followers a Twitter user has to rank the list in each category. I think that this metric is a terrible measure of the quality of users for two reasons. Firstly, it’s too easy to game the system and run up that number without regard for merit. Secondly, it confuses quality with popularity, skewing attention toward the center. That’s the same problem that Digg has, as I’ve already described. What is needed is a more useful metric to measure, compare and match users up with each other. What does that look like? Well, I have at least one idea and will be posting much more about that in the future.
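Just to illustrate the kind of metric I mean (and not the idea I’m hinting at above), here is one crude alternative to raw follower counts: engagement normalized by audience size. The formula and the numbers are placeholders:

```python
def engagement_quality(retweets_received: int, tweets_posted: int, followers: int) -> float:
    """One crude illustration of a quality metric that isn't raw follower count:
    how often a user's tweets get retweeted, normalized per 1,000 followers.
    The formula is a placeholder, not a proposal."""
    if tweets_posted == 0 or followers == 0:
        return 0.0
    return (retweets_received / tweets_posted) / followers * 1000

# A small account whose tweets are widely retweeted can outscore a huge, low-engagement one
print(engagement_quality(retweets_received=400, tweets_posted=100, followers=2_000))    # 2.0
print(engagement_quality(retweets_received=400, tweets_posted=100, followers=200_000))  # 0.02
```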

So where does this all leave marketers? In many respects they are having the rug pulled out from beneath their feet. The top-down loopholes are becoming less and less effective. In many ways, they are being treated like the uninvited intruder at the new media party, trying to butt into the conversation and unable to do anything but annoy the consumer partygoers. Is all hope lost? Hardly. I think that with creative thinking, a firm understanding of what is actually happening, and a healthy respect for what consumers actually want, that industry will improve right along with the media. Speaking as a consumer, I don’t mind if marketers participate in the conversation I’m crafting for myself online, as long as their actions provide value to me. If they don’t, my mob will take care of it.

Topics include: Twitter growth, trouble with TwitPic, and researchers have found a way to ‘de-anonymize’ your Twitter profile based on social media connections. Find links mentioned in the show on the ThotCast Twitter feed @thotcast.

ThotCast Episode 2: The Paranoid Episode

or subscribe to the feed directly. We’re still awaiting iTunes approval.

I’ve decided to give podcasting a go. The show is ThotCast and is designed to do a quick roundup each week of news and events surrounding Twitter. I’m planning to keep each update less than 140 seconds long. Subscribe to the feed and check out the inaugural show here. I’m beginning the process of getting this up on iTunes, but I’m told that the approval process may take a bit of time. I’ve established a Twitter feed for the show that will index links related to the stories discussed and field feedback from listeners. Let me know what you think.