Thursday, January 21, 2010

Giz Explains: SSDs and Why You Wish You Had One












Speed. Toughness. Efficiency. Silence. That's why we want solid-state drives in our computers. But we worry about the zoom-zoom performance degrading over time, and the fact that SSDs might eventually wear out. Here's what you need to know about 'em.

Why Solid-State Drives Are Awesome (Or At Least, Better Than Hard Drives)

To understand what's great about SSDs, let's start with HDDs (you know, old-fashioned hard drives). On a basic level, a hard disk drive works thusly: Inside is a magnetized recording surface called a platter that spins around really fast, with a head that zooms across the disk to read and write data—think of it kinda like a record player, except the head never touches the surface, 'cause that would be very, very bad. So, you can see the problem with hard drives: They're fragile (don't drop your computer) and they're slow to access stuff, because the head has to physically move to where the data is.













With an SSD, on the other hand, we're talking straight silicon. What's inside is a bunch of flash memory chips and a controller running the show. There are no moving parts, so an SSD doesn't need to spin up, doesn't need to physically hunt down data scattered across the drive and doesn't make a whirrrrr. The result is that it's way faster than a regular hard drive in nearly every way, so you get insanely quick boot times, application launches, random writes and almost every other measure of drive performance (writing large files excepted). For a frame of reference, Doron Myersdorf, general manager of SanDisk's SSD group, says an equivalent hard drive would have to spin at almost 40,000rpm to match an SSD. And, you can drop it—at least, a little.

Secrets of the SSD

Typically, what you've got inside an SSD is a bunch of NAND flash memory chips for storage—the same stuff found in memory cards and USB thumb drives—along with a small cache of DRAM, like you'd find on most current hard drives. The difference between the two kinds of memory is that the NAND flash used for storage is non-volatile, meaning the data it holds won't go poof when it loses power, while the faster DRAM is volatile memory, so "poof" is exactly what happens to DRAM data when the power goes out. That's fine, because the DRAM is just for caching things, holding them temporarily to make the whole system work faster.

So, let's talk a bit about flash memory itself. I'll try to keep it straightforward and not lose you, because it's key to the benefits and problems with solid-state storage.

Flash memory is made up of a bunch of memory cells, which are made up of transistors. There are two basic kinds of memory: With single-level cell (SLC) memory, one bit of data is stored per cell. (Bits, the basic building block of information, if you recall, have two states, 0 or 1.) The SLC type is fast as hell and lasts a long time, but it is too expensive for storing the dense amounts of data you'd want in a personal computer. SLC memory is really only used for enterprise stuff, like servers, where you need it to last for 100,000 write cycles.

The solution for normal humans is multi-level cell (MLC) memory. Currently, up to 4 bits can be stored per cell; "multi-level" refers to the multiple voltage levels in the cell used to squeeze those extra bits in. MLC SSDs are much cheaper than SLC but are, as I mentioned, slower, and can wear out faster than their pricier counterpart. Still, for now and into the foreseeable future, all of the SSDs you could come close to owning are of the MLC variety.
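
To put the "levels" bit in concrete terms, here's a quick back-of-the-envelope sketch (mine, not SanDisk's): the number of bits a cell can hold is just log2 of how many distinct voltage levels the controller can reliably tell apart.

    from math import log2

    # Bits per cell = log2(number of distinguishable voltage levels).
    for levels in (2, 4, 8, 16):
        print(f"{levels:2d} voltage levels -> {int(log2(levels))} bit(s) per cell")

    # 2 levels  -> 1 bit  (SLC)
    # 4 levels  -> 2 bits (classic MLC)
    # 8 levels  -> 3 bits
    # 16 levels -> 4 bits (the densest MLC mentioned above)

More levels per cell means more data in the same silicon, but the voltage bands get narrower, which is part of why denser cells are slower and wear out sooner.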

The Bad Stuff

Structurally, flash memory is divided into blocks, which are broken down further into pages. And now we get into one of the major problems with flash: While data can be read and written at the individual page level, it can only be erased at the larger block level. In other words, if you have 256KB blocks and 4KB pages and you want to erase just one page's worth of data, you have to erase the whole block, then write all the rest of the data back to it.
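
To make that overhead concrete, here's a deliberately dumb little Python simulation—my own toy model, not how any real controller is written—using the 256KB-block, 4KB-page figures above (64 pages per block):

    PAGE_SIZE = 4 * 1024                        # 4KB pages
    BLOCK_SIZE = 256 * 1024                     # 256KB blocks
    PAGES_PER_BLOCK = BLOCK_SIZE // PAGE_SIZE   # 64 pages per block

    def rewrite_one_page(block, page_index, new_data):
        """Naive flash update: to change one page, the whole block has to be
        read out, erased, and programmed again with the new page in place."""
        copy = list(block)                  # 1. read all 64 pages into RAM
        copy[page_index] = new_data         # 2. patch the one page we care about
        block.clear()                       # 3. erase the entire block
        block.extend(copy)                  # 4. write all 64 pages back
        return PAGES_PER_BLOCK * PAGE_SIZE  # bytes physically written

    block = [b"\xff" * PAGE_SIZE for _ in range(PAGES_PER_BLOCK)]
    written = rewrite_one_page(block, page_index=3, new_data=b"\x00" * PAGE_SIZE)
    print(f"Changed {PAGE_SIZE} bytes, physically wrote {written} bytes "
          f"({written // PAGE_SIZE}x amplification)")

Changing 4KB of data ends up writing 256KB—a 64x penalty in this worst case—which is exactly the overhead the controller tricks described below try to minimize.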

This is a huge problem, for one, because MLC flash memory wears out after 10,000 write cycles. Two, as the drive fills up, performance significantly degrades. (Anandtech has a pretty great illustration of this, in the middle of a massive deep dive on SSDs you should read if you're at all interested.) That's because without free blocks to write to, you've gotta go through that intensive erase-and-rewrite cycle, which, as you'd imagine, entails a lot of overhead. Problem numero three is that, according to SanDisk CEO Eli Harari, there's "a brick wall" coming: a point in the not-too-distant future when storage at the chip level could simply stop increasing.

Mitigating the Bad Stuff

The thing is, you actually probably still want an SSD in your next computer, to make it run awesomer. Because where there are problems, there are sorta solutions. Remember how I mentioned up above the other major component in an SSD, besides the flash memory, is the controller? They're a big part of what differentiates one company's SSD from another's. The controller is the secret sauce, as SanDisk's Myersdorf told me. Because the game, for now, is all about managing flash better, both physically and logically. In other words, it's about algorithms.

The first standard technique for long flash-memory life is wear leveling, which simply means not writing to the same area of the drive over and over again. Instead, the goal is to use up all the fresh blocks on the drive before you have to start erasing and re-writing any of them, since that burns through precious cycles. Then there's "write amplification": say you save a 1MB document, but because of the whole blocks-and-pages problem described above—reading, erasing and re-writing a bunch of extra blocks and pages—it ends up causing 4MB worth of writes to the drive. That amplification is being lowered, says Myersdorf, because drive management is shifting from being block-based to page-based. More granular algorithms, plus caching and prediction, mean there's less unnecessary erasing and writing.
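
Put numbers on it and you can see why page-based management is a big deal. This is just the article's own figures plugged into a two-line calculation (a sketch, nothing more):

    def write_amplification(logical_mb, physical_mb):
        """How many bytes actually hit the flash for every byte you asked to write."""
        return physical_mb / logical_mb

    waf = write_amplification(logical_mb=1, physical_mb=4)  # the 1MB-document example above
    rated_cycles = 10_000                                   # MLC endurance figure from above
    print(f"Write amplification: {waf:.1f}x")
    print(f"Effective write cycles your data actually gets: {rated_cycles / waf:,.0f}")

A write amplification of 4 effectively turns a 10,000-cycle part into a 2,500-cycle part, so every bit the controller shaves off that factor buys back drive lifetime.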

The biggest thing is what's called TRIM. As you probably know, when you delete something from your computer, it isn't instantly vaporized. Your OS basically just marks the data as "Hey it's cool to pave over this with new stuff." Your hard drive has no real idea you deleted anything. With the TRIM function, when you delete something, the OS actually tells the SSD, "Hey you can scrub this crap." The SSD dumps the block to a cache, wipes the pages with the stuff you want gone, and copies the stuff you want to keep back to a new block, leaving you with clean pages for the next time you want to write something to the disk. This means better performance when you're saving new stuff, since it handles the read-erase-rewrite dance ahead of time. Windows 7 supports TRIM, and Myersdorf says Windows 8 will be even better for solid-state storage.
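
Here's a crude way to picture the payoff—a toy latency model with made-up numbers, not measurements from any real drive:

    ERASE_MS = 2.0   # pretend a block erase costs 2ms (purely illustrative)
    WRITE_MS = 0.2   # pretend programming a page costs 0.2ms (also illustrative)

    def save_latency(trimmed: bool) -> float:
        """Without TRIM, the drive only discovers stale pages when you overwrite
        them, so the block erase lands inside your write. With TRIM, the erase
        already happened in the background after the OS flagged the deleted pages."""
        return WRITE_MS if trimmed else ERASE_MS + WRITE_MS

    print(f"save latency without TRIM: {save_latency(False):.1f} ms")
    print(f"save latency with TRIM:    {save_latency(True):.1f} ms")

Same amount of cleanup work either way; TRIM just moves it off the critical path of your save.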

As for busting through that brick wall of limited storage—a limit rooted in the number of electrons that can reside in a cell—the industry has so far managed to increase flash memory capacity at a pace faster than Moore's Law, and right now Toshiba, who invented NAND flash, is the chip capacity king. The company just announced a new 64GB NAND flash module that combines 16 4GB NAND chips. This would seem to be closing in on that wall, which we don't want, because we want the dollar-to-MB ratio to keep dropping. Myersdorf is optimistic despite his boss's gloomy pronouncement: "There have been several walls in history of the [flash] industry—there was transition to MLC, then three bits per cell, then four—every time there is some physical wall, that physics doesn't allow you to pass, there is always a new shift of paradigm as to how we make the next step on the performance curve."

Okay, the big question then: When are SSDs gonna get seriously affordable? A 160GB version of one of the most acclaimed SSDs, Intel's X25, retails for $470. OCZ's Colossus is a veritable brick of solid-state storage, and the 1TB model has an MSRP of $2,200, though it's going for much more. By contrast, a 1TB WD hard drive is like a hundred bucks on a bad day. Myersdorf says it's hard to say when the dollar-per-byte ratio is going to come down in absolute terms, mostly because of supply and demand, but he did predict that a lot of "mainstream" laptops are gonna have 256GB SSDs in the next 18 months. Oh good, I'll be due for a new laptop right around then.
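
For the record, here's the dollar-per-gigabyte math those list prices work out to—just arithmetic on the figures quoted above:

    drives = {
        "Intel X25 160GB SSD":  (470, 160),
        "OCZ Colossus 1TB SSD": (2200, 1000),
        "1TB WD hard drive":    (100, 1000),
    }

    for name, (price, gb) in drives.items():
        print(f"{name:22s} ${price / gb:5.2f} per GB")

Roughly $2.94 and $2.20 per gigabyte for the SSDs versus about a dime per gigabyte for the spinning disk—that's the gap that has to close.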

Video of the Week – The True Power of an Apple Newton



This is actually really good. When you watch this, keep in mind that it was announced in May 1992 and came out in August 1993.
Also keep in mind its specs:

CPU: ARM 610 (RISC) @ 20 MHz
RAM: 640K internal, 4MB PCMCIA
Display: 336 x 240 reflective LCD
Interface: touch-screen w/ stylus
Ports: RS422 serial, Infrared
Expansion: one PCMCIA (Type II) slot
OS: Newton OS v1.05

When this came out it was really ahead of its time—some analysts might say TOO far ahead—which helps explain why the product did not do so well (that, and maybe the $699 price tag). The market was not ready for an all-in-one PDA/note-taking device.

Another thing to realize is that the hardware really wasn't the most advanced of its time. As unsuccessful as the product was, it does show what Apple is all about, and that is implementing the right software.

It proves that the Apple Tablet will have to have revolutionary software if it is going to revolutionize—or re-revolutionize—the market. Just like the iPhone, the iPod, and the Macintosh.

OnLive Beta gets a preview, lukewarm approval


We've now pretty much reached saturation point with OnLive demos, so it's good to finally see an independent set of eyes poring over the service and giving us the lowdown on the actual user experience. Whether you call it on demand, streamed, or cloud gaming, the concept is remarkably simple -- OnLive pumps games via a web browser onto your machine and gives you the full gaming experience without the need for all that pretty, but expensive hardware. PC Perspective's Ryan Shrout "found" a login to the Beta program and has put together a very thorough comparison between OnLive and playing the games locally on the same computer. His conclusion is that latency issues at present make an FPS like Unreal Tournament unplayable, but slower input games like Burnout Paradise or Mass Effect give pleasingly close renditions of the real thing. We encourage you to hit the source link to see side-by-side video comparisons and more in-depth analysis.

Palm Pre Plus shows off multitasking upgrade with 50 simultaneous apps (video)


Yea, you read that right -- fifty apps loaded side by side by freaking side on the Pre Plus, and the thing just kept on ticking. The chaps over at Pre Central decided to test out specifically how much of an improvement the doubling of RAM and storage in the new handset delivered, and they were not disappointed. Opening up the same apps on both phones, they found the original Sprint Pre (sporting a mere 256MB of RAM) ran out of puff at the 13 app mark, whereas the Pre Plus soldiered on until a nice round fifty was reached. Go past the break to see the video evidence for yourself -- long live multitasking!


Apple rumor roundup: pipe dreams, Lala's role and Verizon's iPhone 4G

In case you haven't noticed, things are getting out of hand in the world of Apple rumors. Frankly, it's all we can do to read another one and trudge onward, but hey -- we've no problem with folks putting their reputations on the line here. Let's dig in to the latest pair, shall we?

The rumor: Apple's acquisition of Lala will actually lead to customers having access to an "online locker" for multimedia. This could be a cloud storage location for one's iTunes library, enabling them to have access to their jams and vids even when away from their at-home storage. The trick is that the cloud would only hold the metadata, and streaming would originate from somewhere else on Apple's end.
Our take: Okay, so we want to believe. Just imagine if your next Apple tablet or iPhone knew exactly what songs you owned in iTunes, and at a moment's notice, you could tap into the iTunes store and stream full, unedited versions of those songs from anywhere. Amazing, no? Problem is, the bulk of iTunes libraries aren't made up of content that was purchased in iTunes (or purchased at all). It seems that the best Apple could do would be to negotiate streaming deals for content you've actually purchased within iTunes, which results in a half-baked user experience. Last we checked, Stevie J wasn't much on half-baked user experiences.

The rumor: Astoundingly, the mythical Apple tablet won't be the company's "one more thing" next week; instead, it'll be a refreshed iPhone... that works on Verizon Wireless. Oh, and iPhone OS 4.0. So says Canaccord Adams analyst Peter Misek, anyway.
Our take: Ha! Apple has never been one to showcase too much at one time, and we're guessing that the outfit would be smart enough to withhold a new iPhone introduction for a separate press event. We don't doubt that a Verizon iPhone is in the works (though an LTE version will be at least a year or two out), but there's no way Jobs steals the tablet's thunder by giving every rabid iPhone user hot sweats when considering the switch to Big Red. Bottom line? Don't bank on it.

At this point, we reckon everyone would be best served by taking a huge step back, a deep breath and one of those so-called "chill pills." Next Wednesday ain't so far away, now is it?

Google's HTML5 YouTube Videos Don't Need Flash


HTML5 is a major part of Google's plans for the future, including Chrome OS—check out this interview for more on that—and one step towards that is getting YouTube to work without a Flash plugin, which they've now achieved. It's not perfect yet (no ads or annotations) and it only works on certain supported browsers (Chrome and Safari, at the moment), but it's still a taste of what's to come. You can hit up TestTube to check it out. [YouTube]

Apple Puts Massive Delay on 27-inch iMac Shipments









Apple is quietly padding the buffer on new, completely stock 27-inch iMac shipments to 3 weeks, for reasons we assume are tied to their well-documented manufacturing issues.

(You've probably heard us talking about the iMac's production problems with yellow and flickering screens, but if not, follow the Faulty iMac Saga here.)

Notably, Apple has not delayed shipments on 21-inch iMacs, even though they, too, can be afflicted. In all fairness, however, I've found reports of 27-inch iMac problems to be far more prevalent.

While Apple hasn't released a statement as to the reasons for delays, we can only hope the company has decided to pin down whatever issues are occurring as opposed to mailing out more broken computers and hoping nobody would notice. [AppleInsider]

The Apple Tablet Interface Must Be Like This











Some people want the Apple Tablet to run Mac OS X's user interface. Others think its UI will be something exotic. Both camps are wrong: The iPhone started a UI revolution, and the tablet is just step two. Here's why.

If you are talking hardware, you can speculate about many different features. But when it comes to the fabled Apple Tablet, there are basically three user interface camps at war. On one side there are the people who think that a traditional GUI—one built on windows, folders and the old desktop metaphor—is the only way to go for a tablet. You know, like with the Microsoft Windows-based tablets, and the new crop of touchscreen laptops.

In another camp, there are the ones who are dreaming about magic 3D interfaces and other experimental stuff, thinking that Apple would come up with a wondrous new interface that nobody can imagine now, one that will bring universal love, world peace and pancakes for everyone—even while Apple and thousands of experts have explored every UI option imaginable for decades.

And then there's the third camp, in which I have pitched my tent, which says that the interface will just be an evolution of an existing user interface, one without folders and windows, but with applications that take over the entire screen. A "modal" user interface that has been proven on the market battlefield, and that has brought a new form of computing to every normal, non-computer-expert consumer.

Yes, people, I'm afraid the tablet will just run a slightly modified version of the iPhone OS user interface. And you should be quite happy about it, as it's the culmination of a brilliant idea proposed by a slightly nutty visionary genius, who died in 2005 without ever seeing the rise of the JesusPhone.

This guy's name was Jef Raskin.

The incredible morphing computer

Raskin was the human interface expert who led the Macintosh project until Steve Jobs—the only guy whose gigantic ego rivaled Raskin's—kicked him out. During his time at Apple, Raskin worked on a user interface idea called the "information appliance," a concept that was later bastardized by the Larry Ellisons and Ciscos of this world.

In Raskin's head, an information appliance would be a computing device with one single purpose—like a toaster makes toast, and a microwave oven heats up food. This gadget would be so easy to use that anyone would be able to grab it, and start playing with it right away, without any training whatsoever. It would have the right number of buttons, in the right position, with the right software. In fact, an information appliance—which was always networked—would be so easy to use that it would become invisible to the user, just part of his or her daily life.

Sound familiar? Not yet? Well, now consider this. Later in his life, Raskin realized that, while his idea was good, people couldn't carry around one perfectly designed information appliance for every single task they could think of. Most people were already carrying a phone, a camera, a music player, a GPS and a computer. They weren't going to carry any more gadgets with them.

He saw touch interfaces, however, and realized that maybe, if the buttons and information display were all in the software, he could create a morphing information appliance. Something that could do every single task imaginable perfectly, changing mode according to your objectives. Want to make a call? The whole screen would change to a phone, and buttons would appear to dial or select a contact. Want a music player or a GPS or a guitar tuner or a drawing pad or a camera or a calendar or a sound recorder or whatever task you can come up with? No problem: Just redraw the perfect interface on the screen, specially tailored for any of those tasks. So easy that people would instantly get it.

Now that sounds familiar. It's exactly what the iPhone and other similar devices do. And like Raskin predicted, everyone gets it, which is why Apple's gadget has experienced such a raging success. That's why thousands of applications—which perform very specialized tasks—get downloaded daily.

The impending death of the desktop computer

Back in the '80s, however, this wasn't possible. The computing power wasn't there, and touch technology as we know it didn't even exist.

During those years, Raskin wanted the information appliance concept to be the basis of the Mac but, as we know, the Macintosh evolved into a multipurpose computer. It was a smart move, the only possible one at the time: It could perform many different tasks, and the result was still a lot simpler than the command-line-based Apple II or IBM PC. It used the desktop metaphor—a desk with folders to organize your documents—a level of abstraction that was easier to understand than typing "dir" or "cd" or "cls."

However, the desktop metaphor still required training. It further democratized computing, but despite its ease of use, many people then and today still find computers difficult to use. In fact, now they are even harder to use than before, requiring a longer learning curve, because the desktop metaphor user interface is now more complex (and abstract) than ever. People "in the know" don't appreciate the difficulty of managing Mac OS X or Windows, but watching some of my friends deal with their computers makes it painfully obvious: Most people are still baffled by many of the conventions that some of us take for granted. Far from decreasing over time, the obstacles to learning the desktop metaphor user interface have increased.

What's worse, the ramping up of storage capacity and functionality has made the desktop metaphor more of a blunder than an advantage: How can we manage the thousands of files that populate our digital lives using folders? Looking at my own folder organization, barely, if at all. Apple and Microsoft have tried to tackle this problem with database-driven software like iPhoto or iTunes. Instead of managing thousands of files "by hand," that kind of software turns the computer into an "information appliance," giving you a specialized interface to organize your photos or music.

That's still imperfect, however, and—while easier than the navigate-through-a-zillion-folders alternative—we still have to live with conventions that are hard to understand for most people.

The failure of the Windows tablet

As desktop computing evolved and got more convoluted, other things were happening. The Newton came along, drawing from Raskin's information appliance concept. It had a conservative morphing interface and it was touch sensitive, and it ended up being the first Personal Digital Assistant—and then it died, killed by His Steveness.

The Newton—and later the Palm series—also ran specialized applications, and could be considered the proto-iPhone or proto-tablet. But it failed to catch on thanks to a bad start, a monochrome screen, the lack of always-connected capabilities and its speed. It was too early, and the technology wasn't there yet.

When the technology arrived, someone else had a similar idea: Bill Gates thought the world would run on tablets one day, and he wanted them to run Microsoft software. The form may have been right, but the software concept was flawed from the start: He tried to adapt the desktop metaphor to the tablet format.

Instead of creating a completely new interface, closer to Raskin's ideas, Gates adapted Windows to the new format, adding some things here and there, like handwriting recognition, drawing and some gestures—which were pioneered by the Newton itself. That was basically it. The computer was just the same as any other laptop, except that people would be able to control it with a stylus or a single finger.

Microsoft Windows tablets were a failure, becoming a niche device for doctors and nurses. The concept never took off at the consumer level because people didn't see any advantage in using their good old desktop in a tablet format that was even more expensive than a regular laptop.

The rise of the iPhone

So why would Apple create a tablet, anyway? The answer is in the iPhone.

While Bill Gates' idea of a tablet was a market failure, it achieved one significant success: It demonstrated that transferring a desktop user interface to a tablet format was a horrible idea, destined to fail. That's why Steve Jobs was never interested. Something very different was needed, and that came in the form of a phone.

The iPhone is the information appliance that Raskin imagined at the end of his life: A morphing machine that could do any task using any specialized interface. Every time you launch an app, the machine transforms into a new device, showing a graphical representation of its interface. There are specialized buttons for taking pictures, and gestures to navigate through them. Want to change a song? Just click the "next" button. There are keys to press phone numbers, and software keyboards to type short messages, chat, email or tweet. The iPhone could take all these personalities, and be successful in all of them.

When it came out, people instantly got this concept. Clicking icons transformed their new gadget into a dozen different gadgets. Then, when the app store appeared, their device was able to morph into an unlimited number of devices, each serving one task.

In this new computing world there were no files or folders, either. Everything was database-driven. The information was there, in the device, or out there, floating in the cloud. You could access it all through all these virtual gadgets, at all times, because the iPhone is always connected.

I bet that Jobs and others at Apple saw the effect this had on the consumer market and instantly thought: "Hey, this thing changes everything. It is like the new Mac after the Apple II." A new computing paradigm for normal consumers, from Wilson's Mac-and-PC-phobic step-mom to my most computer-illiterate friends. One that could be adopted massively if priced right. A new kind of computer that, like the iPhone, could make all the things that consumers—not professionals or office people—do with a regular computer a lot easier.

This was the next step after the punch card, the command line, and the graphical desktop metaphor. It actually feels like something Captain Picard would use.

Or, at least, that's how the theory goes.

How will we type on this?

For the tablet revolution to happen, however, the iPhone interface will need to stretch in a few new directions. Perhaps the most important and difficult user interface problem is the keyboard. Quite simply, how will we type on the thing? If you think it's as simple as making the iPhone keyboard bigger, think again. We have already talked about this issue at length, but it bears repeating.

The other issues involved are:

• How would Apple and the app developers deal with the increased resolution?
• How would Apple deal with multitasking that, in theory, would be easier with the increased power of a tablet?
• Where would Apple place the home button?

The resolution dilemma

The first question has an easy answer from a marketing and development perspective.

At the marketing level, it would be illogical to waste the power that the sheer number of iPhone/iPod touch applications gives to this platform. Does this mean the Apple Tablet would run the same applications as the iPhone, just bigger, at full screen?

This is certainly a possibility if the application doesn't contain a version of its user interface specifically tailored for the increased screen real estate, and it's also the easiest one to implement. The other possibility is that, in case the application is not ready for the extra pixel space, it may run alongside other applications at the iPhone's native 320 x 480 pixels.

Here is a totally made-up example of home-screen icons and apps running on a tablet at full screen:

However, this would complicate the user interface way too much. My logical guess is that, if the app interface is not Tablet-ready, it would run at full screen. That's the cheapest option for everyone, and it may not even be needed in most cases: If the rumors are true, there will be a gap between the announcement of the device and the actual release. This makes sense, as it will give developers time to scramble to get their apps ready for the new resolution.

Most developers will want to take advantage of the extra pixels the screen offers, with user interfaces that put more information in one place. But the most important thing is that JesusTablet-tailored apps represent an opportunity to increase their sales.

From a development point of view, this represents an easily solvable challenge. Are there going to be two applications, one for the iPhone/iPod touch and another for the tablet? Most likely, no. If Apple follows the logic of its Mac OS X resolution-independence guidelines—discussed at the Worldwide Developers Conference in June—the most reasonable option could be to pack the two user interfaces and their associated art into a single fat application.
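
As a thought experiment—this is purely my own sketch, not anything Apple has published—a "fat" application could simply ship both layouts and pick one at launch, based on the screen it finds itself on:

    # Hypothetical resource selection for a "fat" app bundle that ships both an
    # iPhone-sized layout and a tablet-sized layout and decides at launch time.
    IPHONE_WIDTH, IPHONE_HEIGHT = 320, 480   # original iPhone/iPod touch screen

    def pick_interface(screen_width, screen_height, bundle):
        """Use the tablet layout when the screen is bigger than an iPhone's and
        the bundle actually ships one; otherwise fall back to the phone layout
        (which the OS could simply scale to full screen)."""
        bigger_screen = screen_width > IPHONE_WIDTH or screen_height > IPHONE_HEIGHT
        if bigger_screen and "tablet_ui" in bundle:
            return bundle["tablet_ui"]
        return bundle["phone_ui"]

    bundle = {"phone_ui": "320x480 layout", "tablet_ui": "big-screen layout"}
    print(pick_interface(1024, 768, bundle))   # hypothetical tablet screen -> tablet UI
    print(pick_interface(320, 480, bundle))    # original iPhone screen     -> phone UI

One binary, two interface descriptions, and the decision made once at startup—that's the whole appeal of the fat-application approach.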

How to multitask

Most rumors are pointing at the possibility of multitasking on the tablet (and also in iPhone OS 4.0). This will bring up the challenge of navigating through running apps that take over the entire screen. Palm's Web OS solves this elegantly, but Apple has two good options in its arsenal, both already present in Mac OS X.

The app switch bar or a dock
Apple could implement a simple dock that is always present on the screen, or one that's invoked with a gesture, a button press or an on-screen icon. This is the simplest available method, and it can still be made flashy and full of eye candy.

Exposé
This is one of those features that people love in Mac OS X, but that only a few discover on their own. Once you get it, you can't live without it. I can imagine a tablet-based Exposé as an application switcher. Make a gesture or click on a corner, and get all running applications to neatly appear in a mosaic, just like Mac OS X does except that they won't have multiple windows. The apps could be updated live, ready to be expanded when you touch one of them. Plenty of opportunity for sci-fi'ish eye candy here.

A gesture makes sense for implementing Exposé on the tablet—as you can do on the MacBook Pro—but Apple could also use its recently patented proximity-sensing technology. In fact, I love this idea: Make the four corners of the tablet hot, so icons appear every time you get a thumb near a corner. The icons—which could be user customizable—could trigger four different functions. One would close the running application. Another would call up Exposé and bring up the mosaic of all running applications. A third could invoke the home screen, with all the applications. And a fourth, perhaps, could open the general preferences, or bring up a set of Dashboard widgets showing instant information snippets, like in Mac OS X.
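
Purely for illustration—this is my mock-up of the idea, nothing Apple has shown—the whole scheme boils down to a user-customizable mapping from corners to actions:

    # Hypothetical hot-corner mapping for a buttonless tablet (user customizable).
    hot_corners = {
        "top_left":     "close_current_app",
        "top_right":    "expose_mosaic",            # show all running apps, Exposé-style
        "bottom_left":  "home_screen",
        "bottom_right": "preferences_or_dashboard",
    }

    def on_thumb_near(corner):
        """Fire whatever action is assigned to the corner the proximity sensor reports."""
        action = hot_corners.get(corner, "none")
        print(f"thumb near {corner} -> {action}")

    on_thumb_near("top_right")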

Here's an illustration—again, totally hypothetical—of what this sort of Exposé interface might look like:

The trouble with the home button

The physical home button on the iPhone and the touch plays a fundamental role, and it's one of the key parts of the interface. Simply put, it's what lets you exit applications and return to the home screen. On the small iPhone, it makes sense to have it where it is. On this larger format—check its size compared to the iPhone here—things are not so clear.

Would you have a single home button? If yes, would you place it on a corner, where it could be easily pressed by one of your thumbs as you hold the tablet? Which corner? If you added two home buttons for easier access, wouldn't that confuse consumers? Or not? And wouldn't placing a button affect the perception of the tablet as a horizontal or vertical device? This, for me, is one of the biggest—and silliest—mysteries of the tablet.

And what if Apple decides not to use a physical button at all? As I pointed out in the Exposé idea, the physical button could easily be replaced by a user-definable hot corner.

Revolution Part Two

With these four key problems solved, whatever extra Apple adds—like extra gestures—is just icing on the iPhone user interface cake that so many consumers find so delicious. The important thing here is that the fabled Apple Tablet won't revolutionize the computing world on its own. It may become what the Mac was to the command-line computers, but the revolution already started with the iPhone.

If Apple has interpreted the iPhone's indisputable success as an indication of what consumers want for the next computing era, the new device will be more of the same, but better and more capable.

Maybe Apple has ignored this experience and created a magical, wondrous, unproven, completely new interface that nobody can imagine now. You know, the one that will bring universal love, world peace and pancakes for everyone. I'm all for pancakes.

Or perhaps Steve Jobs went nuts, and he decided to emulate el Sr. Gates with a desktop operating system.

The most logical step, however, is to follow the iPhone and the direction set by Raskin years ago. To me, the tablet will be the continuation of the end for the classic windowed environment and the desktop-metaphor user interface. And good riddance, is all I can say.