Dec 28, 2011

Something Completely Different…

From #SecondLife to Second Fiddle in 2012

 

Is it a surprise that @Rodvik wants to focus on making video games?

 

I’ve been quite busy lately with research and development on some projects, so people have noticeably begun asking whether I’m still around. The short answer is “yes”, though some things other than Second Life have been keeping my attention. I won’t go into details about what I’ve been focusing on, but I will say that the technologies are really cool.

 

More importantly, however, I’ve been keeping an eye on recent events concerning Linden Lab and Second Life through the usual channels, such as New World Notes, and one little gem of news from them in particular made me laugh.

 

 

[Image: Rod Humble]

 

 

 

I’m in no way surprised that Rodvik has decided to focus on video games that are unrelated to Second Life and “expand” Linden Lab’s offerings past their flagship namesake. He’s a video game executive who was hired to run a Metaverse company, and I expected he wouldn’t do so well at the job.

 

Now, when I say he wouldn’t do so well, what I implied many months ago is that the only thing this man apparently knows how to do is create video games, which is an entirely different market from managing and growing a metaverse. It’s a lot like the difference between a Bugatti Veyron and a tricycle.

 

The Metaverse is the high-performance, open-ended, user-created masterpiece of virtual environments. A tricycle entertains you for a few hours without actually adding any culture or wider appreciation.

 

Rodvik is a master of making tricycles.

 

That isn’t to say there isn’t money to be made in making tricycles, or that Linden Lab couldn’t be profitable by making and selling them, but that isn’t what Linden Lab as a company is about. At least, that’s not what it should be about. Linden Lab was and is a metaverse company, with a flagship product and culture that is Second Life. When a CEO decides that the company’s flagship product isn’t worth his or the staff’s full attention, and instead decides that the future of Linden Lab is to pretend to be Zynga… there’s something seriously wrong.

 

Let’s face it, Rodvik has Farmville envy.

 

 

[Image: Farmville HD]

 

 

Rodvik’s assurance that there will be continued updates and new functionality for Second Life is not comforting, because we already know what it’s like when the Lindens can’t keep up with the JIRA, fix bugs, or ship a viewer that isn’t half-assed and broken. There is no reason to believe that splitting the focus and resources of an already stressed team so they can make unrelated video games will do anything other than justify less support for Second Life going forward, in favor of coddling whatever video games Rodvik wants to roll out for Facebook instead. This isn’t speculation; it’s simply common-sense management to know that splitting an already overworked team across unrelated projects will mean:

 

1. The new products will be poorly executed.

2. The original product will suffer from poor execution.

3. Both areas of focus will suffer from mediocre execution at best.

 

Because his focus is on the new games, I’m inclined to say that #2 is the most plausible outcome, which means that Second Life (as far as he’s concerned) is now Second Fiddle.

 

 

The decision he has made is the executive equivalent of shifting the focus of the entire company to something unrelated in order to mask incompetence in the original venue he was hired for. If he cannot expand or properly manage the flagship product, but he knows how to make cheap little video games in spades, then the latter will become his and the company’s new focus for delivering projected growth going forward. Like I originally said when he was hired: when you’re a hammer, everything looks like a nail. We already went through this scenario with Mark Kingdon, when he decided the focus was on enterprise servers, and if I remember correctly, that divided focus didn’t bode well either.

 

Now, I could theoretically interpret Rodvik’s direction as a skillful way to focus his and the company’s attention on rebuilding a better version of Second Life for the long run: something from the ground up, or a total overhaul into a new system, to which he would later encourage migration from Second Life.

 

I could interpret it that way, but I choose not to do so simply out of common sense. If this were the focus, he wouldn’t have said “… unrelated to Second Life”.

 

Today marks the day that Linden Lab has begun its transformation into just another Zynga wannabe. Rodvik never really left his old position as a VP at Electronic Arts working on The Sims. He simply carried his old job into his new and unrelated one. In the process, instead of focusing on the job he was given, he’s decided that the entire company should cater to his whims and expertise.

 

In the process, he’s even practiced some great acts of cronyism by surrounding himself with video game executives and industry names to justify why Linden Lab needs to pretend to be Zynga and pay less attention to its own flagship product and community.

 

From day one, all Rodvik saw as the CEO of Linden Lab was this:

 

 

 

[Image: Sims 3 Plumbob, an item available on the Second Life Marketplace]

 

 

The real question in the end is two-fold:

 

Why didn’t anyone expect this to happen from day one?

 

And more importantly

 

What makes anyone think that this is actually good for Second Life?

Dec 16, 2011

3D Virtual Worlds and the Metaverse:

Current Status and Future Possibilities | #SecondLife

 

On December 1st, 2011, quite astounding news arrived in my email from Dr. Dionisio and Dr. Gilbert: the Association for Computing Machinery had not only accepted our research paper 3D Virtual Worlds and the Metaverse: Current Status and Future Possibilities for publication in its journal ACM Computing Surveys, but the reviewers had given the paper the best praise the editors had ever seen. As I read through the critiques from each reviewer, I was humbled by their responses.

 

As Dr. Gilbert succinctly put it:

 

After my recent return from the European Summit of the Immersive Education Initiative (I'm still jet lagged as you can see from the early hour I'm writing this!), Dondi sent me the news that ACM Computing Surveys had accepted our paper with relatively minor revisions.  Even more significantly, the impressions of the editors and individual reviewers of the paper were outstanding with all the reviewers suggesting that the work be considered for a best paper award, the main reviewer viewing the work as a  "seminal" contribution to the field, and the Associate Editor calling the first round reviews the best he's ever seen! 

 

What I've also learned is that ACM Computing Surveys is one of the most cited and respected journals in the field of computer science (I had no idea that Dondi had decided to reach so high in submitting the paper). To get accepted into this journal, especially with such high praise, is a great honor and once the article is published our work will be widely read and could have a real impact on the field.  After we submit the revisions in the next few weeks you can crank up all your social media expertise and begin to distribute the article as "in press." 

 

 

[Image: LAVA Home of the Future]

 

 

 

It’s definitely an honor, and quite humbling to receive that caliber of review from not one but four independent sources across the industry. My understanding of the process was that we should have expected multiple rounds of revisions before being considered for publication, so having it accepted immediately after the first round of peer review is astounding.

 

In the words of the Associate Editor at ACM:

 

These actually are the most positive first-round reviews of a journal article I have seen. If you revise this and include a document that explains how you did, and did not, address comments, I will not need to send this out for further review if it looks satisfactory to me.

 

Let’s put aside the banner waving for a moment, because despite how groundbreaking this all is, it is only a prelude to the bigger picture concerning the Metaverse that I usually provide in articles like this.

 

The Down Low

 

There is a lot of subject matter covered in the paper for the ACM journal, but the title says it all: Current Status and Future Possibilities.

 

Essentially, what we wrote was a comprehensive survey paper looking into the various areas of virtual environments, or immersive environments as some refer to them, and addressing the fundamental challenges and the technologies that can further their progress. The paper also discusses which areas are deficient and in need of further research, in the hope that those areas will be taken up by some of the countless people reading and citing it.

 

Literary influences that led up to the current state of affairs are also outlined, in order to give a proper history of the subject while framing the discussion as the paper continues. As one reviewer put it, we presented the most comprehensive paper on the subject that we could, short of writing an entire book about it.

 

As the message from Dr. Gilbert suggests, I will be making the entire paper available as a PDF when it is sent off to ACM for publication. While there are a lot of topics covered in the paper,  I’ll only be addressing one of them here in this blog.

 

The Take-Away

 

One of the biggest things that can be taken from the paper concerning the future direction of the Metaverse (in my opinion) is the implicit understanding that centralized networking is our Achilles’ heel. It always has been, even before we really got into the hardcore 3D immersive environments we see today. As far back as Lessons from Habitat in 1991, bandwidth has been recognized as a scarce resource in systems like this, and centralized networking was pegged as a culprit.

 

Yes, we can alleviate that stress by offloading across a datacenter, or buying more powerful servers. But the limitation is still there, and all we end up doing in the process is kicking the can down the road instead of actually trying to solve the problem at the root.

 

Whatever the future holds for immersive environments, it is likely that decentralized networking will be a major part of it, in stark contrast to today’s massive datacenters and centralized bandwidth.

 

To Boldly Go…

 

There is also a hint toward the data itself, in that the way that data is distributed would likely not be the same as how we accomplish the task today. I’d like to say that in order for a powerful Metaverse to really evolve, the data has to be agnostic. The best way to understand this is to imagine the replicators in Star Trek and how they work.

 

When the captain walks up to a replicator and says “Tea. Earl Grey. Hot.”, the replicator isn’t looking for a cup of tea in its inventory. Nowhere on the ship is there a room with thousands of cups of hot tea on a shelf waiting to be beamed to the replicator. Essentially, there is a holding tank of sorts on the ship which contains a raw molecular “soup” of materials, and the replicators simply use a digital recipe to pull the components and “replicate” that molecular recipe.

 

 

[Image: Janeway and the replicator]

A replicator works almost like magic. Just like the plot to every Star Trek series.

 

 

If we apply this understanding to purely digital files, the same process works. Let’s say that the replicator is a software program on your computer whose sole purpose is to reconstruct digital files based on digital fingerprints (keys) that you give it. Since we’re already talking about a decentralized network as our future, the “holding tank” full of agnostic data to be used for those reconstructions of files should be spread across every user of the network.
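To make that concrete, here is a minimal Python sketch of the idea. It assumes a simple XOR-based scheme, similar in spirit to “brightnet” style storage systems such as the OFF System; the block size, hash choice, and function names are my own illustration, not anything specified in the paper.

    import hashlib

    BLOCK_SIZE = 128 * 1024  # illustrative block size, not a standard

    # The local "cache" of agnostic blocks, keyed by fingerprint. In a real
    # decentralized system, a block missing from the local cache would be
    # fetched from peers instead.
    block_store = {}

    def fingerprint(block):
        # A block's key is just a hash of its random-looking contents.
        return hashlib.sha256(block).hexdigest()

    def xor_blocks(blocks):
        # Combine equal-length blocks byte by byte; order does not matter.
        result = bytearray(blocks[0])
        for block in blocks[1:]:
            for i, b in enumerate(block):
                result[i] ^= b
        return bytes(result)

    def reconstruct(recipe):
        # A "recipe" is the digital fingerprint of a file: its true size plus
        # an ordered list of key tuples, one per block of the original file.
        # Each tuple names the agnostic blocks that XOR back into one
        # meaningful block.
        out = bytearray()
        for keys in recipe["blocks"]:
            out += xor_blocks([block_store[k] for k in keys])
        return bytes(out[:recipe["size"]])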

 

Why not? We all have a cache folder sitting on our computers from virtual environments and web browsers. What if that cache folder were simply agnostic data, available to the entire network of users in a P2P manner?

 

First off, it would mean that our caches are meaningless to human interpretation: completely random data, not representing anything in particular until we give the system the fingerprint to reconstruct the file we need. Secondly, it would mean that the data becomes multi-use data.

 

That last point is very important. It means that 1GB of multi-use data represents, or contains, any file that is 1GB or less. We can say that Photoshop CS5 is under 1GB for the installer, correct? So that multi-use data would contain Photoshop CS5, and anything else that is 1GB or under. The key, so to speak, is that while such a system just about invalidates the premise of current copyright law, making it impossible to prosecute somebody for using such a system, it brings up something even more interesting.
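Continuing the sketch above, and still under the same assumptions, storing a file is the reverse operation: each source block is XORed against randomly chosen blocks already in the pool, and only the resulting random-looking block is added. The pool stays multi-use; the only thing specific to your file is the recipe it hands back.

    import os
    import random

    def add_random_block():
        # Seed the pool with a block of pure noise.
        block = os.urandom(BLOCK_SIZE)
        key = fingerprint(block)
        block_store[key] = block
        return key

    def store_file(data, mix_count=2):
        # Split the file into padded blocks and disguise each one by XORing
        # it with existing pool blocks. What gets stored is indistinguishable
        # from noise; the recipe is what makes it a file again.
        recipe = {"size": len(data), "blocks": []}
        for offset in range(0, len(data), BLOCK_SIZE):
            chunk = data[offset:offset + BLOCK_SIZE].ljust(BLOCK_SIZE, b"\0")
            mixers = random.sample(list(block_store),
                                   k=min(mix_count, len(block_store)))
            disguised = xor_blocks([chunk] + [block_store[m] for m in mixers])
            key = fingerprint(disguised)
            block_store[key] = disguised
            recipe["blocks"].append(tuple([key] + mixers))
        return recipe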

 

11 Herbs and Spices

 

It’s not the data itself that is really copyrightable, but the means by which that data can be arranged into a unique configuration representing a work. What is really copyrightable happens to be the unique fingerprint that tells the system how to reconfigure multi-use data into a specific file output that is no longer multi-use, or in layman’s terms: the digital recipe.
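A tiny usage example, still building on the sketches above: two different files are stored into the same pool and both round-trip correctly, yet nothing in the pool belongs to either file. Only the recipes differ, and a given noise block is free to appear in both.

    # Seed the shared pool with a handful of noise blocks.
    for _ in range(4):
        add_random_block()

    file_a = b"The quick brown fox jumps over the lazy dog." * 1000
    file_b = b"Completely unrelated contents go here instead." * 2000

    recipe_a = store_file(file_a)
    recipe_b = store_file(file_b)

    assert reconstruct(recipe_a) == file_a
    assert reconstruct(recipe_b) == file_b

    # The pool is just anonymous, multi-use noise; the recipes are the only
    # file-specific artifacts, and they may well reference the same blocks.
    keys_a = {k for keys in recipe_a["blocks"] for k in keys}
    keys_b = {k for keys in recipe_b["blocks"] for k in keys}
    print(len(block_store), "agnostic blocks in the pool")
    print(len(keys_a & keys_b), "blocks referenced by both recipes")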

 

Another interesting side effect of multi-use data is that you’re not storing a 1:1 representation of anything. Since the data is multi-use, that means (for the most part) that you don’t need the storage equivalent of a copy of every single file in the system. All you really need, at the very least, is enough agnostic data to cover the largest file on the network.

 

If the largest file you have on the network is about 50GB (and I’m being very generous here), then on a multi-use data system you would need no more than 50GB of data total for everything you ever store into that network. That’s the bare minimum. Yes, at that level it would take longer to reconstruct the data, but it would still work. The more agnostic data you have in storage, the faster that reconstruction goes. Think of it like over-unity…

 

Needless to say, a multi-use data structure in combination with decentralized networking would mean that, for example, the entirety of the Second Life asset servers could be housed in about 100 terabytes or less (for redundancy), and the reliability of that system would mean the odds of anything in your inventory ever disappearing would be so close to nil as to make no odds.

 

And that is how you solve the root of a problem instead of addressing only the symptoms.