USB Gadgets and Why You Need Them in Your Life



Where would computers be today without the ever-present universal serial bus? Thanks to Ajay Bhatt, the Intel engineer who led the team behind the specification, we now have USB devices and gadgets that are self-powered and plug-and-play. Some of these unique little electronic anomalies are useful, and some are just downright amusing. If you have ever been sitting at your desk rubbing your chin, trying to coax out an idea that keeps slipping away, and noticed you have a bit of a beard (this happens less often with women), you may need the Syba Rechargeable USB Shaver with Cleaning Brush. Just plug it into your USB port and you'll be in the midst of proper hygiene in seconds. You may even want to look into your USB-powered mirror or use your USB-powered toothbrush.

Stay cool in the heat of every moment with the USB powered Multi-colored Fan with Speed control. Oh my, this device works on so many levels and is so handy. Put it on the floor; put it on your desk; put it on your case; it seems to work everywhere. Fresh air is now available via USB.

An item often overlooked during the Holiday season is the USB Christmas tree with multicolor LEDs. Plug this in and you may very well begin singing carols and searching the ceiling for mistletoe. Make the season more jolly using USB.

USB can really help you keep a good grip on reality, too. Consider the USB Pet Rock, for instance. This carefree little friend will always be there when you need it. It's the perfect pet: compatible with all operating systems and consumes no power. You couldn't ask for a better companion. No hassles here. Just plug it into a USB port and off it goes. Well, it really doesn't go anywhere, but it's there when you need it.

Take another look at USB. It is so much more today than it used to be. OutletPC has great USB items. Shop today and be happy tomorrow.

Xbox Kinect: It's not Just for Dancing Anymore!

When the Xbox Kinect hit the market last November, the web was inundated with a wave of cute little girls scratching tiger ears, dancing twenty-somethings, and other well-polished marketing material. The hype was met with modestly successful sales numbers, and quite a few of the devices found their way into homes. In spite of the sales and positive reviews, I still kinda felt like many of you do: that the Kinect was just a gimmick that would be forgotten within the year. However, when Microsoft announced the release of a software development kit (SDK) for the Kinect hardware last week, I decided it was time to give the Kinect another look. I was amazed that, once past the Kinectimals and the generic fitness games, I actually found a piece of technology that was quite impressive.

How it works

The hardware technology behind the Kinect isn't terribly complicated. Using an IR (infrared) emitter, the device floods the room with a pattern of invisible light, and a specialized IR camera watches how that pattern falls across the scene. From the way the reflected pattern is distorted, the device can work out the distance to each object in the room. The Kinect also uses a standard webcam-style camera for color imaging, facial recognition, and some basic planar motion tracking.

Kinect gets more interesting on the software side, however. The software analyzes the 3D image built from the two cameras and compares it with mountains of data about the ways the human body moves and interacts with its surroundings. This data, compiled from millions of samples collected by Microsoft while developing the system, lets the software target and follow 42 points on the human body, so it can track your hand as it waves . . . or swings a lightsaber.
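To get a feel for what software can do with a depth image, here's a minimal sketch in Python. It is emphatically not Microsoft's tracking pipeline; it simply takes a depth frame like the one the Kinect's IR camera produces and finds the point nearest the sensor, which in practice is often an outstretched hand. The frame contents and the "hand" location are invented for the example.

```python
# A minimal sketch (not Microsoft's actual tracker): given a depth frame,
# find the point nearest the sensor, typically an outstretched hand.
import numpy as np

def nearest_point(depth_frame):
    """Return (row, col, depth_mm) of the closest valid pixel.

    depth_frame: 2D array of distances in millimetres; 0 means "no reading".
    """
    valid = np.where(depth_frame > 0, depth_frame, np.inf)  # ignore dropouts
    row, col = np.unravel_index(np.argmin(valid), valid.shape)
    return int(row), int(col), float(depth_frame[row, col])

# Fake a 480 x 640 depth frame: a wall about 2.5 m away and a closer "hand".
frame = np.full((480, 640), 2500.0)
frame[230:250, 310:330] = 600.0          # a blob 0.6 m from the sensor
print(nearest_point(frame))              # -> (230, 310, 600.0)
```

A real tracker would go on to segment that blob, fit a skeleton to it, and follow it from frame to frame, but the raw material is just this kind of per-pixel distance data.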

Why it matters

While the technology behind the Kinect may be a knockout in gaming, its potential really goes far beyond that of virtual bowling. Over the next several years, I think we can expect this type of technology to start popping up all over the place, from remote-less TVs (finally no more tearing up the couch cushions) to room lights that turn on or off with a wave of the hand! It also opens new doors for the field of robotics. Check out this video in which a computer learns to visually recognize different objects by “looking” at them through the Kinect hardware:

It may not seem like much at first glance, but this is the technology we'll need if we ever want those robot butlers people are always harping on about. Using the Kinect technology, a robot could successfully look through the fridge and identify which can of soda you're asking for, which shirt you want ironed, or which part of your back you need massaged! Robo-servants aside, this technology could also be used to pilot tanks, aircraft, and, most likely, giant brightly-painted robots:

Imagine conducting a war overseas fought entirely by machines controlled remotely by technology not so different from what you're using to play "Dance Paradise!"

These possibilities are the reason why the release of the Kinect SDK is newsworthy. Microsoft is giving hackers, researchers, and others free access to the techy innards of the Kinect, leaving them free to modify, alter, and expand the uses of the device, thereby sowing the seeds for hundreds of new technologies and developments in the future! To play you out, here's a video of a guy using a hacked Kinect to control Windows 7:

Don't forget to shop at OutletPC and have a look at our Table of Contents for a complete list of all our fabulous blog posts to date!

Five PC Games you Should be Playing

Video game development is a tough business. Games cost millions to develop, produce, market, and distribute, and then, in most cases, get only a couple of months of solid sales before they're thrown onto the discount rack—not to mention pirated off the internet! Because of this, video game developers are too often guilty of cutting corners, rushing deadlines, and generally skimping on the polish that can make games great. Only a rare few have what it takes to make it past those first few weeks, let alone to gain entry into the pantheon of PC gaming perfection.

Here’s a list of five of today’s popular games awesome enough to make the cut, at least for me:

First up: Starcraft II. If you haven't heard of this one, you've been living in a hole for the last year, or at least not paying attention to video games. This is the long-awaited follow-up to the original Starcraft, released in 1998, the game so addicting that a Korean man played it for forty hours straight and died. Its developer, Blizzard, is basically the Pixar of computer game companies: they only release one game every two or three years, and each one is the best game ever. They're responsible for other famous games like World of Warcraft and Diablo.

Starcraft II is an RTS (Real-Time Strategy) game in which you take command of an army and attempt to kill another army. Players can choose the versatile Terrans (humans), the resilient Protoss (big-headed aliens), or the HOLY-FREAKING-CRAP-THERE-ARE-SO-MANY-OF-THEM Zerg (creepy bug-monsters). While it's a fairly simple concept, Blizzard has balanced the three factions so meticulously that the game plays more like Chess or Risk than rock-paper-scissors. Its story mode is fun and engaging, with great characters, campy humor, and a compelling plot, but the real treasure is the multiplayer mode. Seconds after launching the game you can be pitting your army against a rival anywhere in the world! The game play is fast, stressful, and wonderfully addicting.

Mass Effect is a game for anyone who loves Star Wars, Star Trek, or anything else with the word Star in front of it! It's developed by BioWare, a company previously known for Baldur's Gate and Star Wars: Knights of the Old Republic. Mass Effect is actually a trilogy: the first installment was released in 2008, the second in 2010, and the third is slated for release late this year.

Players take the role of Commander Shepard, a human hero who embarks on a galaxy-saving quest against a race of sentient machines called "the Reapers." The story is extremely interesting and is fully voiced by a terrific cast, but what makes this game so remarkable is the degree to which the player can shape the story. Events are set, but the player gets to choose how to respond to them, causing the story to fork down hundreds of different possible paths. The effect is compounded through the series, since the choices you make in the first game are saved and carried over to the second game, and soon the third as well.

The Fallout series is an old one, its roots dating back to 1997, but the modern releases, Fallout 3 and Fallout: New Vegas, are far and away the series' crowning jewels. After lying fallow for years, the franchise was picked up by the RPG giant Bethesda Softworks, the company responsible for the über-popular Elder Scrolls games.

While the graphics are great, the story is interesting, and the game play is phenomenal, the main tug that keeps me coming back to Fallout 3 is the atmosphere. Bethesda has crafted a world so detailed and convincing that it takes only the lightest suspension of disbelief to feel like you’re there, in the game! Rather than following a set path as in most games, the world of Fallout 3 is completely open, allowing the player to explore and progress the story as he or she wishes. If you’re like me, you’ll spend hours simply exploring, forgetting where you’re “supposed” to be going in favor of figuring out exactly what that gang of super mutants is guarding.

This series is definitely the oldest on this list, and may even be the most successful. The first Civilization game was released in 1991 by MicroProse, a long-extinct developer. The mastermind behind the games, Sid Meier, went on to develop an army of titles with a studio he co-founded, Firaxis Games. Over the years, the Civilization series has seen five major installments, as well as a host of spin-offs and expansions.

The most recent version of Civilization, Civ V, was released just last year and shows the benefits of years of polish. Players assume leadership of one of history's civilizations (there are tons to choose from: Roman, American, Mongolian, Aztec, and plenty of others) and guide their people through time, from prehistory to the distant future. As the leader of your civilization you'll dictate the paths of technological research, the building of national infrastructure, the waging of wars, and the maintenance of diplomatic relations. It's tough, it's complicated, and it's hopelessly addicting! Give this one a try and you'll soon be skipping showers in favor of "just one more turn . . ."

The Sims 3 is the most recent offering from the best-selling PC game series of all time! For comparison, the Sims games have collectively sold 125 million copies, putting the series squarely in Tetris territory for popularity. While the Sims series is currently in the hands of an EA Games subsidiary, The Sims Studio, its roots stem from Maxis, a developer founded in 1987 by gaming legend Will Wright.

The Sims 3 is like a digital dollhouse; you create a character, build them a home, and they live their lives while you watch. Players can intervene, of course, teaching their Sim new tricks like cooking, gardening, writing, and more! They can even marry their Sims off, taking them through the process of meeting, courtship, and, finally, the plunge! If you choose, your Sim can have a baby whom you'll have to raise and whom you'll take control of once your original Sim grows old and dies (you'll have to take them to visit the grave of their deceased parent sometimes, of course). Addictive, fun, and easily accessible, The Sims 3 is a great game.

Click here to shop OutletPC's selection of computer games, or check out my post about the latest PC gaming technologies here.


Has USB finally Met its Match? Meet Thunderbolt!

Tech aficionados may have heard a thing or two about Intel's "Light Peak" over the last couple of years. It was an all-powerful new I/O interface meant as an all-in-one replacement for several of the data connections currently popular in computers—though it hadn't received much media attention while in development.

That changed today when Intel shot out a press release not only presenting their new connector to the world but announcing its immediate availability through Apple's latest line of MacBook Pro laptops. Intel also declared a name change for the technology, dubbing the new connector "Thunderbolt".

What is it?

Thunderbolt is a data transfer technology, similar to USB, but capable of far greater speeds than anything readily available today. The new connector can transfer data at fantastically high speeds, up to 10Gbps. In comparison, the USB 2.0 connector you've most likely got on your computer runs at only 480Mbps, making Thunderbolt roughly 20 times faster than the current popular technology! It accomplishes this by essentially doing for PCI-Express (an internal high-speed expansion bus on your motherboard) what eSATA did for SATA: it provides the same quality of connection in an external, plug-and-play form.
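If you want to sanity-check that "roughly 20 times" figure, the arithmetic is simple; this little snippet uses the raw signalling rates, which overstate real-world throughput on both buses:

```python
# Raw signalling rates only; actual throughput is lower on both buses.
thunderbolt_gbps = 10.0     # one Thunderbolt channel: 10 Gbit/s
usb2_gbps = 0.48            # Hi-Speed USB 2.0: 480 Mbit/s

print(thunderbolt_gbps / usb2_gbps)   # 20.83..., i.e. roughly 20 times faster
```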

In addition to pure data, as with a USB cable or the PCI bus, Thunderbolt can also carry video and audio signals through an integrated DisplayPort protocol, meaning that Thunderbolt will be able to push greater-than-1080p video, eight channels of audio, and data through the same cable. In short, Thunderbolt could effectively replace SATA, PCI-Express, PCI, FireWire, HDMI, DisplayPort, USB, and audio ports all with equal ease.

What does it mean for you?

Not much right now; it will be several years yet before this new technology becomes commonplace. Massive changes such as this are fueled by money, money is generated by demand for a product, and most folks won't be demanding Thunderbolt until their current gadgets break. As of today there are no products that Apple's new Thunderbolt-enabled laptops can connect to, though we'll soon see compatible offerings from Apple (likely their next generations of monitors and Apple TVs) as well as from storage and media companies Apogee, Avid Technology, Blackmagic, LaCie, and others.

What does it mean for the industry?

I'm sure Intel sees it as an end to USB, a sentiment I find unlikely. While USB 3.0, the current top of the USB line, is only capable of 5Gbps, half of what Thunderbolt can handle, USB 3.0 is also backwards compatible with USB 2.0—the mainstay connection for hundreds of thousands of products that will still be in use ten years from now.

What I’m more interested in is the continually blooming buddyship between Apple and Intel. Those two computing giants are getting a little too close for comfort in my opinion and, should they become too intertwined, could strike a big blow to Windows-based platforms.

For now though, let’s just focus on shopping at OutletPC! If you’d like to know more about video connectors such as DisplayPort and HDMI, check out my Idiot’s Guide to Video Cables. If you’d like to read Intel’s official press release on Thunderbolt click here.

The Coffee Pot Webcam

What was the first picture a webcam ever took? Someone's foot, you might assume, as the technician fumbled to get the cam to power up. But no. It was the coffee pot in the Trojan Room of the computer laboratory at the University of Cambridge, in ye olde England, circa 1991. This new-wave technology was created to keep hard-working computer science students from making pointless trips to the coffee room only to find the pot empty. But even with its modest 128 x 128 grayscale picture, it was still the beginning of a technology that would become important once the Internet and World Wide Web got off the ground.

In 1994, the first commercially viable webcam, the QuickCam by Connectix, entered the marketplace. This original model was only compatible with the Mac, connected via serial port, and cost around $100. Jon Garber, the original designer, wanted to call it the Mac-camera, but the name was nixed by the marketing department (surprise), which believed the device would someday cross over to the PC as well (surprise again). In 1995, it did cross over, becoming the QuickCam for Windows. The webcam as we love it today was born.

Webcams today are much more feature-rich and cost a fraction of the original. High-definition color, along with software for distorting faces and adding text bubbles, is readily available. Noise-canceling microphones and audio are also commonplace. You can call your friends video-phone style (think Skype) and hold video conferences with ease. It's all there.

If you don’t have a webcam yet, you should get one. They are simple to install and cheap to buy. Get with the program and get videoized today.

By the way, the Coffee Cam at Cambridge was switched off for eternity on August 22, 2001, and the coffee pot (not the original) sold online for 3,350 British pounds. Go figure.

Is my Power Supply really dead?

So you're not getting any video and your fans aren't spinning up? More than likely, there is an issue with your power supply. There are a couple of ways to test a power supply. One is to use a voltage meter. Another is to use a power supply tester; you can pick one up here, and it's a handy tool to have, especially if you're a computer technician. A third way to check is to plug a known-good power supply into your motherboard. If the motherboard then boots, your original power supply is bad.

What form factor PSU do I need?

Power supplies come in several form factors, including ATX, Micro ATX, Mini ITX, Micro PS3, LPX, SFX, WTX, CFX, and TFX. You can find a wide assortment here. Use the following information as a guide to choosing the correct power supply for your chassis (and see the sketch after the list for a quick way to check the fit). Approximate dimensions for each are listed below:

ATX: 150 x 140 x 86 (mm)
Micro ATX: 126 x 101 x 65 (mm)
Mini ITX/Flex: 160 x 140 x 85 (mm)
Micro PS3: 150 x 101 x 86 (mm)
LPX: 150 x 140 x 86 (mm)
SFX: 100 x 125 x 75 (mm)
WTX: 150 x 230 x 86 (mm)
CFX: 125 x 101 x 69 (mm)
TFX: 171 x 86 x 66 (mm)
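As a quick illustration of how you might use those figures, here's a minimal Python sketch: a lookup table of the dimensions above plus a fit check against a case's PSU bay. The bay size is a hypothetical example, and the check ignores mounting orientation and cable clearance.

```python
# A minimal sketch: check which PSU form factors fit a case's PSU bay, using
# the approximate width x depth x height figures (in mm) listed above.
PSU_FORM_FACTORS = {                 # (width, depth, height) in mm
    "ATX":           (150, 140, 86),
    "Micro ATX":     (126, 101, 65),
    "Mini ITX/Flex": (160, 140, 85),
    "Micro PS3":     (150, 101, 86),
    "LPX":           (150, 140, 86),
    "SFX":           (100, 125, 75),
    "WTX":           (150, 230, 86),
    "CFX":           (125, 101, 69),
    "TFX":           (171, 86, 66),
}

def fits(bay_mm, psu_mm):
    """True if the PSU is no larger than the bay on every axis."""
    return all(p <= b for p, b in zip(psu_mm, bay_mm))

bay = (150, 160, 86)  # hypothetical mid-tower PSU bay, in mm
print([name for name, size in PSU_FORM_FACTORS.items() if fits(bay, size)])
```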

How many watts do I need?

The more watts you have, the more devices your power supply can support, which means you can add the latest graphics cards, faster processors, additional hard drives, the works! Take special note of whether your power supply is rated at true wattage or maximum wattage. A higher number does not necessarily mean that your power supply outputs more watts if it is rated at maximum power; true wattage is the more accurate rating.

Can my power supply ever have too many watts?

Absolutely not! The power supply only outputs as many watts as the installed devices in your computer actually need. For example, let's say you have a 480-watt power supply. If the devices in your computer require 180 watts, then that 480-watt power supply will only output 180 watts. The advantage of the 480-watt unit is that it can supply those 180 watts without being pushed to its maximum capability, which can lead to a longer lifespan for the PSU.
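If you want a rough way to size a unit, add up ballpark draws for your components and leave some headroom. The sketch below does exactly that; the wattage figures are illustrative assumptions, not measurements, so check your own parts' specifications.

```python
# A rough sketch of sizing a PSU: sum estimated component draws and leave
# about 30% headroom. The figures below are ballpark assumptions.
parts_watts = {
    "CPU":             95,
    "Graphics card":  180,
    "Motherboard":     50,
    "RAM (4 sticks)":  12,
    "Hard drives (2)": 20,
    "Fans and extras": 15,
}

total = sum(parts_watts.values())       # 372 W in this example
recommended = total * 1.3               # headroom keeps the PSU off its ceiling
print(f"Estimated draw: {total} W; suggested PSU: {recommended:.0f} W or more")
```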

What’s the best brand?

The better brands may cost more, but there is no single brand that makes the best power supply. For many people, it comes down to personal preference or their own experiences. Antec, Seasonic, Thermaltake, OCZ, and Corsair are a few of the big players in the higher-end power supply market. You'll notice that higher-end power supplies are generally heavier, due to the amount and quality of the components used in manufacturing them. That does not mean a heavier power supply will always be a better power supply, but generally weight and quality correlate. The better power supplies have an efficiency rating of 80 percent or above.


ATI CrossFire and NVIDIA SLI – Brothers of a Different Mother(board)

High-end gaming rigs can get pretty crazy. Expensive, shiny cases with glowing liquid cooling tubes, after-market CPU coolers the size of lawnmower engines, more effects lighting than a Honda Accord from the Fast and the Furious, and, of course, SLI or CrossFire configured video cards.

For the uninitiated, SLI and CrossFire refer to the installation of multiple linked video cards which, when their powers combine, may summon CAPTAIN PLANET! (Or at least render him more quickly).

More specifically, SLI refers to the linking of NVIDIA graphics cards, while CrossFire, or CrossFireX (CrossFireX is what it’s called when there are more than two cards), is restricted to ATI Radeon products.

Despite their different-sounding names, the two are fairly similar, with only a few noteworthy differences between them. First, let's examine the basics of the concept, then we'll look at the quirks of each, and finally we'll talk about whether or not upgrading to dual graphics cards is a good choice for you.

The Basics

In either setup, CrossFire or SLI, two or more cards are connected via a bridging device and are made to work together, as one card, to more quickly render 3D video. The two cards will assume a master/slave relationship; one card will receive information from the computer and will pass a portion of it along to the slave card, which will then process it and pass it back to the master card, which will then display the information. Because of this, the video ports of the master card will be the only ones able to output any video.

Ideally, the cards will split the workload right down the middle, 50/50. They divide their work in one of three ways, typically chosen by the user (a quick sketch of the first mode follows the list):

SFR – SFR (Split Frame Rendering) is a process where each frame is halved horizontally and each half is given to one of the two processors. So basically, one GPU renders the images on the top half of the screen and the second takes care of the bottom.

AFR – AFR (Alternate Frame Rendering) is a similar process, except that rather than cutting the frame in two, the cards alternate entire frames; so one GPU may render every even-numbered frame while its companion deals with the odd frames.

SLI AA – SLI AA (SLI Anti-Aliasing) is a bit more complicated a process than the other two. (For an explanation of how AA works, I'll pass the buck to this post from Panther Products.) This mode is different from the others because its focus is to improve the quality of the image rather than the rendering speed. SLI or CrossFireX cards can handle a great deal more anti-aliasing together than a single card could on its own.
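As promised above, here's a toy Python sketch of the SFR idea: the frame is cut into a top half and a bottom half, and each "GPU" handles one. The render_half function is a stand-in for real GPU work; the point is only how the split is assigned.

```python
# A toy illustration of Split Frame Rendering (SFR): the frame is halved
# horizontally and each "GPU" renders one half.
from concurrent.futures import ThreadPoolExecutor

FRAME_HEIGHT = 1080

def render_half(gpu_id, rows):
    # Placeholder for rendering the given scanlines on one GPU.
    return f"GPU {gpu_id} rendered rows {rows.start}-{rows.stop - 1}"

halves = [range(0, FRAME_HEIGHT // 2), range(FRAME_HEIGHT // 2, FRAME_HEIGHT)]

with ThreadPoolExecutor(max_workers=2) as pool:
    for line in pool.map(render_half, (0, 1), halves):
        print(line)   # GPU 0 gets the top half, GPU 1 the bottom
```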

SLI

SLI (Scalable Link Interface) as we know it was first released to the public in 2004 and has slowly been gaining popularity since, especially among the gaming crowd.

An SLI configured system will have two discrete graphics cards, both of which must use the same GPU (Graphics Processor Unit), e.g. two GTX 580s but not one GTX 580 and one GTX 570; they don’t necessarily have to be from the same manufacturer however. These cards also must be installed on a compatible motherboard which will have two (or more) PCI-Express x16 ports as well as a compatible chipset, either one of NVIDIA’s own nForce or one of Intel’s newer (P55/X58 or later) chipsets—don’t take my word for it though, always check with your motherboard’s manufacturer rather than assuming it does or doesn’t. Linking the two cards is an SLI bridge, a small connector that provides a 1Gbps connection between the two cards, allowing them to function without stealing bandwidth from the PCIe bus. The SLI Bridge should be included in the purchase of any SLI compatible video card.

CrossFire

ATI stepped into the multi-graphics-card racket a year later than NVIDIA did, but their offering is no less strong and is actually a little more forgiving. ATI Radeon cards don't necessarily have to possess identical GPUs to be CrossFired; any two cards from the same family, e.g. an HD 5870 and an HD 5850 (they're both from the 5800 family), can be linked together. This makes it easier to upgrade to CrossFire, since you don't have to buy two or three of the exact same card all at once to ensure proper linkage. To compensate, CrossFired cards are able to share the rendering workload more dynamically. NVIDIA cards typically split the work between them 50/50, while ATI cards can vary the ratio, giving less work to the slower processor and allowing the stronger of the pair to shoulder the greater burden.

Like SLI, CrossFire requires two (or more) PCI-Express slots, two (or more) compatible cards, and a compatible motherboard chipset. Though neither CrossFire- nor SLI-compatible boards are particularly rare, motherboard chipsets much more commonly favor AMD's CrossFireX, to the point that, if a board has two PCI-Express slots and doesn't specifically advertise SLI compatibility, it's safe to assume it was built with CrossFire in mind.

Is it Worth it?

While it’s good just to know what CrossFire and SLI configurations are, it’s better to know if they’re even actually worth it. The short answer is yes; benchmarks (like these) show tremendous improvement to frame rates in nearly every instance. The longer answer is, not surprisingly, maybe.

Running dual video cards is a significant investment; the cards are not cheap, and neither are the motherboards that support them, nor the power supplies you'll need to feed them. Before you go running off to grab a second or third video card, you'll want to think about the return on investment you'll be getting from the upgrade.

You'll also need to consider bottlenecks in your system: fifty video cards won't make your CPU any faster, they won't give you more RAM, and they can't reduce access times to your storage drives. In short, you shouldn't waste money on a multi-GPU setup if the rest of your computer isn't up to snuff. (Check out this article to determine your PC's snuffiness.)

Finally, you'll want to think about what you'll be doing with your video cards. Better gaming is the most common use for them, but which games are you playing? In a game like Starcraft II, in which there are hundreds of tiny things moving around on the screen at any given time, your CPU does more to determine the FPS than your video card does, because the small calculations that drive the game play are still handled by the CPU; video cards only do video. If you're playing Fallout 3, however, where there are lots of decrepit buildings to render, your video card does more of the work.

So there it is—everything I know about SLI and CrossFire! Feel free to reward me for my hard work by purchasing a new video card from OutletPC, or glean more from my fabulous intellect by reading my blog series, Why Your Computer Sucks.


Finally, here’s that kid in the Batman costume I’ve been promising you.

USB 3.0: New Speed for You and Me

The ubiquitous USB: the plug-and-play connection we've become accustomed to for adding devices without any need for computer prowess or tools is a wonderful technology. And it keeps getting better. Pushing data through a cable faster and faster is a continual technological challenge, and the pipe keeps getting bigger. Being able to hot-swap devices is taken for granted today. It should be. But it hasn't always been this easy.

Before 1994, there were (and in select situations still are) serial cables, parallel cables, ESDI, and many other ways of attaching devices to computers. Finally, a group of seven companies got together to figure out a way to connect external devices without having to shut everything down just to make a connection. The Universal Serial Bus was born. We were saved. Unlike older connection standards, the USB connection could also supply power, so in some cases a device didn't even need its own power source. Everything started to look rosy. But, like most things that start out simple, things quickly became complicated.

There were all sorts of connector types and speeds and cables and on and on. So, without boring everyone with all that went on, let's just get to the basics of what is happening today and what you need to know.

USB 3.0

The current advancement in USB technology is SuperSpeed USB (USB 3.0). The new spec delivers roughly 10x the speed (theoretically 4.8Gbit/s) of today's Hi-Speed USB 2.0 connections. USB 3.0 remains backward compatible with 2.0 and 1.1, with the same ease of use and plug-and-play capabilities. Although the USB 3.0 cable is different, it uses the same type of connectors and plugs in the same way.
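To put those numbers in perspective, here's a quick back-of-the-envelope calculation of how long a 25 GB file (an arbitrary example size) would take at the theoretical rates; real transfers are slower because of protocol and device overhead.

```python
# Theoretical transfer times at the raw rates quoted above.
file_gigabits = 25 * 8          # 25 gigabytes expressed in gigabits

usb2_gbps = 0.48                # Hi-Speed USB 2.0
usb3_gbps = 4.8                 # SuperSpeed USB 3.0

print(f"USB 2.0: {file_gigabits / usb2_gbps / 60:.1f} minutes")   # ~6.9 minutes
print(f"USB 3.0: {file_gigabits / usb3_gbps:.0f} seconds")        # ~42 seconds
```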

Six signals are carried on the USB 3.0 cable. Four signals are for a SuperSpeed data path and two are non-SuperSpeed. The USB 3.0 bus will provide 50 percent more power for non-configured devices and a whopping 80 percent more power for configured devices.
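For a sense of what those percentages mean in practice, here's the arithmetic using the per-port current limits from the USB 2.0 and 3.0 specifications; note that the milliamp baselines come from the specs, not from this post.

```python
# Per-port current limits from the USB specs: USB 2.0 allows 100 mA before a
# device is configured and 500 mA after; USB 3.0 allows 150 mA and 900 mA.
usb2_ma = {"unconfigured": 100, "configured": 500}
usb3_ma = {"unconfigured": 150, "configured": 900}

for state in usb2_ma:
    gain = (usb3_ma[state] - usb2_ma[state]) / usb2_ma[state] * 100
    print(f"{state}: {usb2_ma[state]} mA -> {usb3_ma[state]} mA (+{gain:.0f}%)")
```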

Make sure you have USB 3.0-compatible devices and an operating system that supports them if you plan on benefiting from the new specification. There are many devices that support USB 3.0 on the market today: motherboards, ExpressCards, controllers, external enclosures, and more are available to take advantage of the increased SuperSpeed bus. You can spot USB 3.0 ports by the blue plastic inside the connector.

Things keep getting faster and better. If you want the latest power in USB, you may just need USB 3.

Bulldozing the Sandy Bridge – The State of AMD vs. Intel

It only takes a minute shopping for a computer to confirm that the battle between tech-titans AMD and Intel is still alive and well. The war these two companies are waging is not a new one; it’s been going on for more than forty years now, and it’s not likely to end soon. With such tight, unending competition, it’s not a bad idea to check in every now and then, take the pulse of the two companies, investigate their current products, and determine who’s really the better CPU maker: AMD or Intel.

Intel

Let's begin by taking a look at the brightest and best of what each company has to offer. Intel's current top of the line is their second-generation Core processors, codenamed "Sandy Bridge" (check out my complete Sandy Bridge article here). Sandy Bridge was officially released in January 2011 and possesses some tremendous upgrades over previous CPUs.

The most remarkable of these upgrades is the on-die graphics controller built into each of these chips. On-die graphics refers to integrated graphics physically located on the processor, which run much faster, and with greater power, than motherboard-integrated video, allowing the processor alone to output video quality rivaling that of a discrete graphics card! Speed-wise, the second-generation Core CPUs are a step above anything else on the market save Intel's own enthusiast-level first-generation Core i7 CPUs, which will probably continue to rule the Intel roost until the second wave of Sandy Bridge processors hits the shelves late in 2011.

AMD

AMD's current front-runner is the hexa-core Phenom II X6 1100T processor, released in mid-December of 2010. This processor boasts six physical cores to Intel's four, and yet is not nearly as fast as the upper echelons of Intel's offerings. The ace up AMD's sleeve is the up-and-coming line of "Bulldozer" CPUs.

The name Bulldozer refers to a major redesign of the processor's overall architecture, rather than an improvement to older technology, and it will be the biggest change in the company's product line since the Athlon 64 arrived in 2003. Within this line, the consumer-level product (codenamed "Zambezi") will feature processors with 6-8 cores. In the Bulldozer architecture, each processor core is paired with another; the two share resources and work as one. The combined pair is referred to as a "Bulldozer module" (the source of the new line's name), and there will be 3 or 4 of them on one CPU. Initial impressions of these processors suggest they'll be up to 50% faster than Intel's Core i7 950! (Check out the data here.)

So who’s Better?

For the moment, it's Intel. The Sandy Bridge line of processors is extremely fast, the weakest of which is only slightly slower than the best of AMD's processors, and cheaper to boot. Not to mention that when purchasing a Sandy Bridge CPU, you don't typically have to drop the cash for a graphics card. However, when Bulldozer hits shelves in April, I think we can expect Intel to pass the torch. AMD's new architecture will revolutionize multi-core processing and wow the PC world collectively.

Remember, though, the current Intel line is meant for the mid-range consumer market, similar to the LGA 1156 processors we've seen over the last couple of years. Intel's true monster, the LGA 2011 platform, isn't slated for release until Q4 of 2011; it's meant to supersede the current LGA 1366 enthusiast-level processors that are the fastest available today.

Have a look at OutletPC’s line of both AMD and Intel Processors Here!

If you’d like to learn more about picking a processor upgrade, check out my post on Choosing a Processor Upgrade

Five Things your Computer Won’t be in Ten Years

I read Popular Science magazine. My wife got me a subscription a couple of years ago for my birthday, and ever since, its arrival has been one of the highlights of the month (which is sad, I know, but I don't get out much). As I've read the magazine month to month, though, I've never ceased to be surprised by the sheer volume of new technologies and ideas that are in the works at any given time. If half of the new stuff being developed right now made it all the way to finished, widely distributed products, it would drastically change the world! Energy would be cheap and abundant, pollution would vanish, and we'd all be able to kick up our heels while robots take out the trash!

Unfortunately, however, most new technologies never make it to a finished product, especially not on their first go-around. The first patented device resembling the modern light bulb, for instance, was created way back in 1841. Other, steadily improving models were patented in 1845, 1851, 1872, and then by Thomas Edison in 1879. Yet even with all these inventors working their butts off to bring the incandescent light bulb into the world, electric lighting didn't become common in American homes until the 1920s, more than 80 years after the first patent was issued!

Computing has followed a similar path: the first machines were built in the late thirties, while the first consumer-level computer wasn't ready until 1974! While it's true the invention-to-product process has sped up considerably since the time of the light bulb, it's also true that many of the cutting-edge technologies futurists like to reference (such as carbon nanotubes, ferroelectric RAM, and quantum computing, among others) are still far from general use. So, rather than making rash predictions about the crazy things your computer will have become in ten years, let's talk about five things your computer won't be in 2021.

  1. Slow – While we're still miles away from any crazy new processing power sources in the home computer, there are some technologies that will drastically improve the speed of the computer and will likely be implemented within the next ten years.
    • Processors, which are currently built from silicon, could run much faster if they were built from diamond. This may sound ludicrous, but a diamond-based transistor was demonstrated way back in 2003 and ran at 81GHz! Plus, synthetic diamonds are just as good as the real thing and cheaper than ever! If diamond won't serve as a viable replacement, there's also molybdenite.
    • Solid-state drives, which are 6-10 times faster than the mechanical hard drives in common use today, currently cost more than two dollars per gigabyte, compared to the roughly 10 cents per gigabyte we pay for mechanical drives. As the technology matures and acceptance grows, prices will drop and solid-state drives will become the norm.
  2. Disconnected – While computer speed will definitely improve, the real computing bottleneck in today's world is the internet. The fastest internet connection commonly available today can transfer data at a rate of 1000Mbps; however, this is only possible through a physical Ethernet connection. Wireless computers, such as your laptop, can connect to a wireless signal and, with the fastest technology, surf the web at speeds maxing out at 300Mbps, less than a third of the wired connection. Mobile smartphones, which can access the internet anywhere via a 3G or 4G network, are only able to communicate at a measly 3.1Mbps!

    Since the squeaky wheel gets the grease, it’s a lot more likely we’ll see faster phones and wireless connections than we will faster LAN connections.

  3. Cut-Off – As high-speed internet becomes more and more common, expect your computer to become less and less of an island. A common trend that we can expect to continue is cloud-based computing. On your computer today, programs are stored on a hard drive, from which your processor pulls them when needed. In cloud computing, rather than being stored on your hard drive, a program is stored on the internet. The benefit is that your programs can be accessed anywhere, at any time, without needing to be installed or set up. Cloud-based computing is already in popular use with things like Google Docs or OnLive gaming.
  4. Stuck to a Desk – With the rise of the smartphone and the tablet, this one seems obvious. However, I don't see the mobile computer being as much a threat to the desktop computer as it is to the wallet, phone, and car keys you're carrying around. As I mentioned before, smartphones already have the ability to control your car and home. Soon, they'll be able to serve as your identification as well. How long until the Department of Motor Vehicles starts issuing virtual driver's licenses? Ten years? I think so! (Maybe then you'll be able to spruce up your license picture with Photoshop!)
  5. Completely off the Desk – It’s been suggested more than once that the desktop computer will soon go the way of the dinosaurs, superseded by smart phones and tablets. While I do expect it to decline in popularity, I wouldn’t expect it to become a rarity any time soon. The desktop computer is bigger, more powerful, and more engaging than a physically smaller device ever could be. PC gaming, graphic design, data-entry, word processing, and other processes will always be easier and better on a desktop computer than they ever could be on a tablet or a phone.

While there are certainly paradigms just waiting to shift between now and 2021, I believe the computers we’ll be using then won’t be too much different from those we’re using now. Faster, more constant connections, more gadget consolidation, high speed processing, and more are all on the horizon. But if you’re waiting for a robot to take out your trash, your house is going to get really smelly in the next ten years.

While you’re waiting, come shop at OutletPC! Our stuff is neat! Also, if you’d like to learn more about how to upgrade the computer you’ve already got, check out my article on the best upgrades for your computer.