Thursday, November 12, 2009
The Intel® Desktop Board DX58SO is designed to unleash the power of the all-new Intel® Core™ i7 processors, with support for up to eight threads of raw CPU processing power, triple-channel DDR3 memory, and full support for ATI CrossFireX* and NVIDIA SLI* technology. Today’s PC games like Far Cry 2* need a computing platform that delivers maximum multi-threaded CPU performance and eye-popping graphics.
Designed to unleash the benefits of business software, Intel vPro technology helps IT enhance fleet manageability by pairing vPro-equipped PCs with top software management solutions from providers like Microsoft, Symantec, LANDesk, HP, and more. And with the exceptional energy-efficient performance of the latest hafnium-based 45nm Intel® Core™ microarchitecture, you'll have a robust foundation for Microsoft Windows Vista* along with future 64-bit and next-generation multithreaded software.
Intel is committed to continuing to introduce new and innovative process technologies in keeping with Moore's Law, delivering great leaps in performance, new levels of energy efficiency, and lower cost per function to the end user.
32nm logic technology
Discover how Intel has demonstrated the world's first 32nm logic process with functional SRAM, packing more than 1.9 billion second generation high-k metal gate transistors.
45nm logic technology
See how Intel has used dramatically new materials to develop its next-generation hafnium-based 45nm high-k metal gate silicon technology.
65nm logic technology
Discover advanced dual- and quad-core capabilities with 65nm silicon technology.
Follow the silicon revolution from the invention of the transistor in 1947 to today's mega performing multi-core processors.
* 60 Years of the Transistor: 1947-2007 (PDF 216 KB)
Intel’s silicon R&D pipeline
Explore how Intel's silicon R&D pipeline includes 3-D transistors, III-V materials, carbon nanotubes, and semiconductor nanowires for future high-speed, low-power transistors and interconnects.
As the foundation for Intel's processor technology, Intel® microarchitecture employs next-generation, 45nm multi-core technology. Optimized to deliver state-of-the-art features that raise the bar on energy-efficient performance, Intel microarchitecture continues to be the catalyst for innovative new designs.
Tuesday, November 3, 2009
The public domain is a broad umbrella term for anything, usually a creative work, that can be freely used. For example, the Bible is a public domain work. It can be used, copied, sold, quoted, translated, or altered without infringing on anyone’s copyright or patent privileges. An example of public domain software is SQLite, a database engine found in many applications, whose authors have dedicated its code to the public domain.
While you can find lists of public domain software, you are more likely to find lists of free software. In most cases, this software is not really in the public domain. Your obtaining it means that you have acquired a license to use it. If you’ve ever installed a computer program for free, you probably had to accept terms and conditions for using the software. Some of these terms you agree to may prohibit you from selling, altering, or profiting from the software in any way.
There are a number of free software programs that are not public domain software. For example, you can easily get copies of Adobe Reader, Netscape, Internet Explorer, and a variety of other programs. Also, when you purchase a computer, you may be given several free programs, but again, these are licensed to you only, rather than being yours to copy or distribute, as public domain software would be.
You can find public domain software in a variety of locations. The UCLA library offers numerous downloads and catalogues. Some of the most interesting public domain software is key to the sciences. Programs like WebLab and Visual Molecular Dynamics allow you to create three-dimensional drawings of molecules. A great place to look for public domain software, as well as free software and shareware, is the Free Software Foundation (FSF). You’ll find lists of public domain software, as well as software that grants you automatic licenses to use specific programs. The FSF is also specific in telling you whether you are downloading freeware or public domain software.
There are often references to books and other printed matter that has entered public domain. What this means is that while the printed works were at one time copyrighted and therefore considered the intellectual property of the author and publisher, that is no longer the case. In many cases, the work is no longer in print by the original publisher, the author is deceased, and the copyright was allowed to expire. In effect, the work is no longer owned by a person or entity. When no ownership can be established, the work is considered to be in the public domain.
Along with printed matter, the same general principle applies to early motion pictures that were made before 1922, or were produced by studios that no longer exist. When there is no evidence that someone today is the beneficiary of those works and can reasonably claim ownership of the films, they are considered to be in the public domain. This means that anyone can obtain a copy of the film and reproduce multiple copies for sale without infringing on the rights of anyone.
In addition to works that were once copyrighted but no longer enjoy that status, there are works intentionally created for general public use. Some government documents are an excellent example of this type of public domain product. Unless there are disclaimers to the contrary, government documents are understood to be accessible and usable by everyone, without the need to observe a copyright. Generally, however, it is anticipated that if a section of the document is quoted, the quote will be referenced properly.
As a broad definition, public domain materials are any form of knowledge that is freely available to the general public and carries no restrictions on use. Books, movies, and other published works are all common examples of public domain information, but essentially any work that previously enjoyed a copyright but is no longer covered would be considered to be in the public domain.
Thursday, October 29, 2009
The DNS system was devised as Internet use expanded, and researchers realized that having to type in numerical addresses could be challenging for some users. Under this system, which is administered by the Internet Corporation for Assigned Names and Numbers, users can type in words or assortments of characters which are easier to remember than strings of numbers. Each domain includes a top level, second level, and third level or subdomain.
The creation of domain names also allowed businesses to brand themselves more effectively. Instead of telling people interested in Volkswagen cars to go to a specific numerical address, for example, the company could tell customers to go to volkswagen.com. The DNS system is sometimes compared to the phone book, because it allows people to look up a person, product, or service by name, rather than having to remember the right number.
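The phone-book comparison above can be seen directly with Python's standard library, which exposes the same name-to-number lookup that DNS performs. This is a minimal sketch; "localhost" is used here so it runs without network access (it always maps to 127.0.0.1), but any registered domain name would work in its place.

```python
import socket

# Ask the resolver to translate a human-readable name into the
# numerical IP address that computers actually use to connect.
hostname = "localhost"
ip_address = socket.gethostbyname(hostname)
print(f"{hostname} resolves to {ip_address}")  # localhost resolves to 127.0.0.1
```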
A web address such as www.wisegeek.com is considered a domain name. The first part of the address, the “www,” is the third level or subdomain. “wisegeek” is the second level, and the “com” is the top level domain. Individual domain owners can create multiple subdomains under their second level domain, such as the subdomain “example” in example.wisegeek.com. Many people refer to second level domains as “domain names” because people often reference sites by the second level domain alone. However, multiple sites could have the same second level and different top level domains, which makes it important to spell out a domain name in full to avoid confusion.
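The breakdown of levels described above can be sketched in a few lines of Python. The www.wisegeek.com example is taken from the text; the splitting logic itself is a simplification (it ignores multi-part country-code suffixes such as "co.uk").

```python
# Split a full domain name into its top level, second level,
# and any subdomains, mirroring the breakdown in the article.
def parse_domain(domain):
    labels = domain.split(".")
    return {
        "top_level": labels[-1],      # e.g. "com"
        "second_level": labels[-2],   # e.g. "wisegeek"
        "subdomains": labels[:-2],    # e.g. ["www"]
    }

print(parse_domain("www.wisegeek.com"))
# {'top_level': 'com', 'second_level': 'wisegeek', 'subdomains': ['www']}
```

The same function also explains the del.icio.us trick mentioned later: `parse_domain("del.icio.us")` yields "us" as the top level, "icio" as the second level, and "del" as a subdomain.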
Top level domains such as “com,” “org,” and “net” are generic. Anyone can register domain names in any of these top level domains. Some other generic top level domain options like “edu,” “gov,” and “mil” are restricted to people who can prove that they have a legitimate reason to use them. Top level domains sorted by country code, such as “ie” for Ireland, are also available, as are sponsored domains controlled by various industries.
When someone wants to register a new domain name, he or she has a choice of top level domain, assuming that those domain names are not already taken, and that he or she is authorized to use a particular domain. Some people like to get creative with top level domain names, as in the case of the social bookmarking site Delicious, which registered the domain icio.us so that it could create the subdomain “del” and spell out del.icio.us with its domain name.
A parked domain includes one simple page. If the site is intended for development, the page will indicate it is under construction or coming soon. Domain parking can be renewed annually, and there is no deadline as to when a site need be developed. You might find you don’t have time, or you might decide to let the domain expire at the end of the contract, typically one year. In this case, the only investment lost is a few dollars.
If you decide to develop your domain, you will need to pay for hosting services at that point. The right hosting service will provide enough space for your website and any special scripts or services you require. Once the domain is being hosted, it is no longer parked.
If you already have a successful site, another use of domain parking is to secure addresses similar to your main website and redirect traffic there - an inexpensive way to protect your website. For example, wiseGEEK.net redirects traffic to the proper site, wiseGEEK.com. The first domain is parked. The parked domain need not reside on the same host server as the main website.
Some people use domain parking for the sole purpose of ‘re-selling’ the address - transferring ownership to a buyer for a fee. This occurred more in the early days of the Internet, when major companies had yet to arrive and were willing to pay a high price for their trademark names. Laws were eventually enacted to protect trademarks, but a parked page can still advertise the sale of a domain.
If interested in domain parking, keep in mind a few considerations. Most domain registrants offer hosting services, but you may or may not want the domain seller to host the domain once it is no longer parked. Be sure to check that the domain seller does not retain any rights to the domain. The buyer should have the power to control the domain’s registration information, and most importantly, the ability to transfer the domain to an independent hosting service when and if desired. Check to see whether a fee is associated with transferring the domain.
The process of multi-homing makes use of what is known as Stream Control Transmission Protocol, or SCTP. Essentially, a single SCTP endpoint supports connections to more than one IP address. By establishing connections to multiple addresses, multi-homing helps enhance the overall stability of the host's connectivity.
One of the advantages of multi-homing is that the computer host is somewhat protected from the occurrence of a network failure. With systems that make use of a single IP address and connection, the failure of the connected network means that the connection shuts down, rendering the end system ineffectual as far as connectivity to the Internet is concerned. With multi-homing, the failure of a single network only closes a single open door. All the other doors, or IP addresses associated with the other networks, remain up and functional.
In general, multi-homing is helpful for three elements of effective web management. First, multi-homing can help to distribute the load balance of data transmissions received and sent by the computer host. Second, the redundancy that is inherent to multi-homing means fewer incidents of downtime due to network failure. Last, multi-homing provides an additional tool to keep network connectivity alive and well in the event of natural disasters or other events that would normally render a host inoperative for an extended period of time.
Multi-homing is often employed in situations where access to the Internet is critical to the operation of a business related effort. For example, multi-homing will be included as part of the disaster recovery initiatives that many financial institutions have in place. By creating network redundancy, it is possible for banks, brokers, and investment firms to remain accessible to customers even when some type of unanticipated event has crippled the primary network interface.
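The "open doors" idea behind multi-homing can be sketched as a simple failover loop. Real SCTP multi-homing happens inside the transport layer, not in application code, so this function and its simulated network paths are purely illustrative.

```python
# Try each network path ("door") in turn; if one has failed,
# simply move on to the next, as multi-homing does.
def send_with_failover(message, links):
    for link in links:
        try:
            link(message)          # attempt transmission on this path
            return link.__name__   # success: report which path worked
        except ConnectionError:
            continue               # this path is down; try the next
    raise ConnectionError("all network paths failed")

def primary(msg):                  # simulate a failed primary network
    raise ConnectionError("primary network down")

def backup(msg):                   # simulate a healthy backup network
    pass

print(send_with_failover("hello", [primary, backup]))  # backup
```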
Servers are powerful computers that have extremely large hard drives, or an array of hard drives. Space is then rented to those who want a "website presence" on the Internet.
Every server on the Internet has a unique numerical IP address. You can think of servers as apartment buildings with unique addresses. Each apartment unit within each building is equivalent to space rented out for individual websites. And like real apartment buildings, each unit also has an address based on "the building" in which it is located.
When you rent a space on a server then, you're setting up house on the Internet. You can be reached by a unique address (the website address), which is based on the server's address.
There are many different types of web hosting. Most packages come with certain capabilities for users, such as scripts that allow interactive functions, forms, and bulletin boards. For professional purposes, there are also web hosting services that offer commercial packages bundling business tools, like point-of-sale packages and credit card processing.
Prices for Web hosting vary from free to hundreds of dollars a year, depending on your needs. Web hosting for personal websites that don't require any special tools and have low traffic (not a high number of viewings) can be found easily for free.
Free Web hosting is convenient but has its drawbacks. Usually it will be required that you allow the server to run advertisements on your website. The advertisements are normally banner ads -- a banner at the top of the page for example -- and sometimes pop-up ads as well. Most free hosting services offer an alternative pay-plan to have the advertisements removed.
Another consideration is that free services normally allocate your website address as an extension of the server's address. For example: www.thewebhost.com/yourwebsite. If you want an address like: www.yourwebsite.com, you will have to pay to register your own domain name.
The term "Web 2.0" is commonly associated with web applications which facilitate interactive information sharing, interoperability, user-centered design and collaboration on the World Wide Web. Examples of Web 2.0 include web-based communities, hosted services, web applications, social-networking sites, video-sharing sites, wikis, blogs, mashups and folksonomies. A Web 2.0 site allows its users to interact with other users or to change website content, in contrast to non-interactive websites where users are limited to the passive viewing of information that is provided to them.

The term is closely associated with Tim O'Reilly because of the O'Reilly Media Web 2.0 conference in 2004. Although the term suggests a new version of the World Wide Web, it does not refer to an update to any technical specifications, but rather to cumulative changes in the ways software developers and end-users use the Web. Whether Web 2.0 is qualitatively different from prior web technologies has been challenged by World Wide Web inventor Tim Berners-Lee, who called the term a "piece of jargon".
Monday, October 26, 2009
Intel Matrix Storage Technology provides new levels of protection, performance, and expandability in 2008 for desktop and mobile platforms. Whether using one or multiple hard drives, users can take advantage of enhanced performance and lower power consumption. When using more than one drive, the user can have additional protection against data loss in the event of a hard drive failure.
Valuable digital memories are protected against a hard drive failure when the system is configured for any one of three fault-tolerant RAID levels: RAID 1, 5 or 10. By seamlessly storing copies of data on one or more additional hard drives, any hard drive can fail without data loss or system downtime. When the failed drive is removed and a replacement hard drive is installed, data fault tolerance is easily restored. In this way, Intel Matrix Storage Technology provides the level of data protection necessary for today's digital computing platforms. In the digital office the increased redundancy reduces costly downtime and maintains employee productivity involving the PC.
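The mirroring idea behind RAID 1 described above can be modeled in a few lines. Real RAID operates on disk blocks in hardware or the storage driver; the two dictionaries here are a toy stand-in for two physical drives.

```python
# Two "drives": every write is duplicated to both, so either
# one can fail without losing data (the RAID 1 idea).
drive_a, drive_b = {}, {}

def mirrored_write(block, data):
    drive_a[block] = data      # write the block to the first drive
    drive_b[block] = data      # ...and an identical copy to the second

def read_with_failover(block, failed_drive=None):
    # If one drive has failed, read the surviving copy instead.
    for drive in (drive_a, drive_b):
        if drive is not failed_drive:
            return drive[block]

mirrored_write(0, "family photos")
print(read_with_failover(0, failed_drive=drive_a))  # family photos
```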
Three high school students earned top honors at the Intel International Science and Engineering Fair, a program of Society for Science & the Public, when they each received an Intel Foundation Young Scientist Award and a $50,000 college scholarship.
In addition to these Intel Foundation Young Scientist Award winners, more than 500 Intel ISEF participants received scholarships and prizes for their groundbreaking work. Intel awards included those for the 18 "Best of Category" winners, one from each category, who each received a $5,000 Intel scholarship and an Intel® Centrino® Duo Mobile Technology-based notebook.
A single microprocessor is like a miniature skyscraper with stairway-like circuits between each floor. Hundreds of these "skyscrapers" can be produced on a silicon wafer at a time.
From start to finish, a microprocessor takes about 2 months to produce. Fabrication begins with a very thin slice of silicon. Over 300 manufacturing steps later, this silicon wafer holds hundreds of microprocessors. If you could enlarge the wafer to the size of a swimming pool, the surface would look like a miniature city.
Now think small and ask yourself this: How are such tiny circuits put in such a small chip? Good question. No mechanical object or pen could lay down such incredibly microscopic wires. Instead, the pathways for the current are created by using solvents to remove channels of material. These microscopic channels are then etched with chemicals and implanted with electrons to make them conduct electricity.
Once the areas of the chip have been mapped out by purpose, the circuitry has to be designed down to the individual transistor. With over 500 million of them in modern microprocessors, that's a lot to keep track of. It's like building a city by designing every room in every home and building before you even pick up a brick.
Pluck a hair from your head. (Really.) Now look at it. It isn't very thick, is it? Well, to a microprocessor manufacturer, that hair looks like a telephone pole. That's because a hair is more than 2000 times wider than a transistor on a microprocessor. Wires between transistors are even thinner. They're more than 4000 times thinner than a hair.
How big is a human hair? About 100 microns in diameter. That means a transistor is just 0.045 microns wide.
What's a micron? It's a very small metric measurement. You're probably familiar with centimeter marks on a ruler. (If not, go look at one.) A micron is 0.0001 of a centimeter.
A microprocessor transistor then is 0.0000045 centimeters wide. (Want that in inches? It's 0.00000177 of an inch.)
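The arithmetic above is easy to verify. The 100-micron hair width and the 0.045-micron (45nm) transistor width are the figures given in the text; the snippet simply carries out the conversions.

```python
# Verify the hair-versus-transistor comparisons from the article.
hair_microns = 100
transistor_microns = 0.045

cm_per_micron = 0.0001                  # 1 micron = 0.0001 cm
transistor_cm = transistor_microns * cm_per_micron

print(f"{transistor_cm:.7f}")           # 0.0000045 (cm)
print(f"{transistor_cm / 2.54:.8f}")    # 0.00000177 (inches)
print(round(hair_microns / transistor_microns))  # a hair is ~2222x wider
```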
- Fetch—Microprocessor gets a software instruction from memory telling it what to do with the data.
- Decode—Microprocessor determines what the instruction means.
- Execute—Microprocessor performs the instruction.
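The three steps above can be sketched as a tiny simulator. The three-instruction "machine language" here is invented purely for illustration; a real microprocessor fetches and decodes binary opcodes, but the loop structure is the same.

```python
# A minimal fetch-decode-execute loop over a made-up instruction set.
memory = [("LOAD", 5), ("ADD", 3), ("STORE", None)]  # a tiny program
accumulator = 0
stored = None
pc = 0                                 # program counter

while pc < len(memory):
    op, operand = memory[pc]           # fetch: get the next instruction
    pc += 1
    if op == "LOAD":                   # decode the operation, then execute
        accumulator = operand
    elif op == "ADD":
        accumulator += operand
    elif op == "STORE":
        stored = accumulator

print(stored)  # 8
```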
Friday, October 16, 2009
First, let's look at how brains and computers work. A brain uses special cells called neurons that work together to process information and respond with an action. A computer uses a collection of circuits called a microprocessor. One is made of living cells; the other, of electronic circuits. So there's a big difference there.
Now let's consider which is smarter. The answer depends on how you define smart. If smart is speed, a computer wins. A person takes a few seconds to add two 3-digit numbers (245+987). A computer can complete several million long-division problems (387÷243) in a single second. A computer is also tireless. The electronic circuits don't wear out. A human doing long division all day would want lots of breaks—and a good night's sleep.
What if smart is having a good memory? In that case, a computer wins too. A computer can store an entire library of books in its memory and recall them without a single mistake. Now consider a person. Have you ever tried just to memorize a long poem? It's an enormous task for a person to memorize a book.
What if being smart is being able to make well-reasoned decisions? Here a person wins by a huge margin. Computers can only calculate and sort information based on the software we design for them. How good their choices are depends on how good the software is. Compare this to a person. Humans don't need software. We can sort and calculate facts using our knowledge and experience. We also can make judgments and decisions based on whatever facts we're confronted with—not just the facts a computer has been programmed to recognize. In this way, we're a lot smarter than computers.
What if you define smart as the ability to think original thoughts? Here again, humans have an enormous advantage. Humans think original thoughts every day. The evidence of these thoughts is in the inventions, art and books all around us. The computer is one such invention. Are computers capable of original thought? So far, they're not. Artificial intelligence is a field of science devoted to developing devices that someday may be able to reason and solve problems. It's important to remember though that no matter how "intelligent" we make computers, they will only be as smart as the software we humans create for them.
Other kinds of output include sound from your computer's speakers and documents printed by your printer. Output can also include files like MP3s, which let you download music from the Internet onto an MP3 player you can take with you anywhere.
A simpler kind of chip is used to make DVD players, remote controls, and electronic calculators. The chips in these devices are embedded processors. They're made to do one thing well, and the instructions are coded into them. You can't install new software to change what they do. For example, you can't do word processing on your DVD player.
Microprocessors are much more versatile than embedded processors. Change the software you're using and you can go from doing word processing to playing a computer game. Change the software again and you can explore the Internet. Instead of being designed to do one thing, microprocessors are designed to do whatever the software you select instructs them to do.
To process information, computers need to be able to store it. Otherwise, like the phone, information would come and go before anything could be done with it.
Computers store all kinds of information. They store the information you give them, instructions from the software you're using, plus the instructions they need to operate. To store all this, they use two basic kinds of storage. Temporary storage is for information actively being used for processing. Random Access Memory (RAM) accepts new information for temporary storage. Long-term storage is for information computers use again and again, such as the instructions the computer prepares itself with every time you turn it on. These instructions are stored in Read Only Memory (ROM), a type of memory that does not accept new information.
Computers also use a variety of devices to store information that isn't actively being used for processing, such as hard drives, optical disks, and removable media.
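The RAM-versus-ROM distinction above can be modeled in Python: a plain dictionary plays the role of RAM, which accepts new information, while a read-only mapping plays the role of ROM, which is fixed once "manufactured". The boot message is, of course, invented for the example.

```python
import types

ram = {}                                   # writable working memory
rom = types.MappingProxyType(              # read-only startup instructions
    {"boot": "load operating system"}
)

ram["document"] = "my sentence"            # RAM accepts new data
print(rom["boot"])                         # ROM can be read...

try:
    rom["boot"] = "something else"         # ...but not written
except TypeError:
    print("ROM does not accept new information")
```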
Sunday, October 4, 2009
Computers are information processing machines. That means that you can use them to access and change information like numbers, text, pictures, and even music. Think of what you can do to modify a single sentence. Using the computer, it's easy to add, delete, or rearrange words. To change a sentence with your computer, though, first you have to get the sentence into your computer.
Input devices are used to put information in your computer. You type a sentence on your keyboard and it goes into the computer. You speak into a microphone and your computer records your words. You make funny faces at the video camera and your computer records every one of them.
Both a toaster and a computer have physical parts you can touch such as the keyboard and mouse. We call these parts hardware.
Here the similarities between toaster and computer end and the differences begin. Only the computer has something called software that enables it to figure out what to do with the input you give it. You can't touch software. Software gives the computer the ability to process many kinds of information. In contrast, all a toaster can process is bread (and the occasional waffle).
Another difference is a computer has a microprocessor. The microprocessor is the device in the computer that performs most of the tasks we ask the computer to do—from playing computer games to graphing the number of people who prefer cricket to curling. The microprocessor reads and performs different tasks according to the software that instructs it. This ability is what makes the computer such a versatile machine.
The key thing to remember is this: both computer and toaster have four basic components to how they operate (input, storage, processing, and output). Unlike the toaster, the computer is unlimited in the things it can do.
Babbage's design for his ultimate calculator, the Analytical Engine, was never built, but it did anticipate the four components essential to modern computing: input, storage, processing, and output.
The problem with Babbage's and other mechanical calculators was just that—they were mechanical. The moving parts they relied on were slow and subject to breakdown.
What made modern computers possible was the invention of something that could do calculations and other information processing with no moving parts and do it very fast. That something was electronic components. With electronic components, a fast and efficient machine such as Babbage proposed could be built with all four components essential to modern computing.
If you're wondering why Microsoft CEO Steve Ballmer has gone out of his way to badmouth IBM recently, there's a very simple reason: IBM has just released a cloud-based product that takes dead aim at one of Microsoft's cash cows, Exchange.
LotusLive iNotes is a cloud-based service for e-mail, calendaring, and contact management, and costs $3 per month per user, reports Computerworld. The magazine notes:
IBM is aiming the software at large enterprises that want to migrate an on-premise e-mail system to SaaS (software as a service), particularly for users who aren't tied to a desk, such as retail workers. It is also hoping to win business from smaller companies interested in on-demand software but with concerns about security and service outages, such as those suffered by Gmail in recent months.
In other words, it wants to take Exchange users away from Microsoft, as well as corporate Gmail accounts away from Google.
Ballmer is clearly not pleased, and most likely worried to a certain extent. That would explain his recent bizarre criticism of IBM for exiting the hardware business. As I've written in my blog, since exiting the hardware business IBM has thrived, and its stock has risen steadily, by 30 percent. In that same time frame, Microsoft's has declined by 30 percent.
SRAMs and the microprocessor
The company's first products were shift register memory and random-access memory integrated circuits, and Intel grew to be a leader in the fiercely competitive DRAM, SRAM, and ROM markets throughout the 1970s. Concurrently, Intel engineers Marcian Hoff, Federico Faggin, Stanley Mazor and Masatoshi Shima invented the first microprocessor. Originally developed for the Japanese company Busicom to replace a number of ASICs in a calculator already produced by Busicom, the Intel 4004 was introduced to the mass market on November 15, 1971, though the microprocessor did not become the core of Intel's business until the mid-1980s. (Note: Intel is usually given credit with Texas Instruments for the almost-simultaneous invention of the microprocessor.)
The Intel Science Talent Search (STS) is America’s most prestigious science research competition for high school seniors. Since 1942, first in partnership with Westinghouse and since 1998 with Intel, Society for Science & the Public, the Washington-based nonprofit dedicated to the advancement of science, has provided a national stage for America’s best and brightest young scientists to present original research to nationally recognized professional scientists.

Each spring, 40 finalists are selected from a nationwide pool of thousands to attend the week-long Intel Science Talent Institute in Washington, D.C. There, students have the opportunity to present their research projects to the general public and members of the scientific community at the National Academy of Sciences, meet with distinguished government leaders and participate in a rigorous judging process. Over $1 million is awarded annually to Intel STS participants and their schools. Awards range from $5,000 scholarship grants and laptop computers for all finalists to the grand prize of a $100,000 college scholarship.
Saturday, October 3, 2009
Push productivity to new levels while reducing costs by upgrading to desktop PCs with Intel® Core™2 processor with vPro™ technology. Featuring industry-leading multi-core performance¹ along with built-in security and manageability, Intel vPro technology is designed from the ground up to keep downtime to a minimum and productivity at an all-time high.²

Giving your business the ultimate competitive advantage, new PCs with Intel® Core™2 processor with vPro™ technology can help reduce operating costs and increase user productivity, while enhancing network security. Upgrading can help avoid the escalating software and hardware support costs of older PCs while reducing system downtime. Plus, new power-efficient designs can aid in keeping energy costs low, while built-in security features help combat the expense of security threats.
Thursday, October 1, 2009
1 GB RAM
100 GB hard drive
13.3" Screen
Microsoft Windows XP Home Edition
Starting at $1849
The slim, lightweight VAIO SZ Notebook delivers an inspiring blend of intelligent mobile design, cutting-edge performance and contemporary style. With a 13.3" widescreen display, long battery life, Intel Centrino Duo mobile technology, revolutionary Hybrid Graphic System and more, you can work faster, play longer and get more from every moment with the VAIO SZ Notebook.
1 GB RAM
80 GB hard drive
17" Screen
Microsoft Windows Vista Home Premium
Starting at $1399
You can't leave anything up to Lady Luck during your late-night dogfighting sessions or when the rival clan throws down a challenge. The blazing blend of Intel Core 2 Duo processors, NVIDIA SLI Technology and a stunning HD display ensures only your skills will determine the outcome. The "my computer bugged out for a second" excuse won't work any longer.
2 GB RAM
250 GB hard drive
17" Screen
Apple MacOS X 10.5
Starting at $2799
The latest Intel processor, a bigger hard drive, plenty of memory, and even more new features all fit inside just one liberating inch. The MacBook Pro has the performance, power, and connectivity of a desktop computer. Without the desk part.
2 GB RAM
64 GB hard drive
13.3" Screen
Microsoft Windows XP Professional
Starting at $2549
ThinkPad X Series notebooks put the ultra in ultraportable. They're designed for on-the-go professionals who need maximum portability and light weight. And there's no trade-off in usability or durability.
2 GB RAM
160 GB hard drive
12.1" Screen
Microsoft Windows Vista Business / XP Tablet PC Edition downgrade
Starting at $1537 to $1735
Toshiba's lightweight Portege M750 Tablet PC lets you work in ways and places you never dreamed. This incredibly versatile tablet PC offers the proven performance of a conventional notebook including a keyboard and touchpad with the flexibility of a tablet. So you can use a stylus to tap and draw your way to better productivity, or use the 12.1" screen for the ultimate in computing freedom.
Chipsets Intel® System Controller Hub UL11L, US15L, US15W
Wireless technologies Integrated Wi-Fi and/or WWAN
¹ System performance, battery life, power savings, high-definition quality, video playback, wireless performance, and functionality will vary depending on your specific operating system, hardware, chipset, connection rate, site conditions, and software configurations. Wireless connectivity and some features may require you to purchase additional software, services or external hardware.
² WiMAX connectivity requires a WiMAX-enabled device and subscription to a WiMAX broadband service. WiMAX connectivity may require you to purchase additional software or hardware at extra cost. Availability of WiMAX is limited; check with your service provider for details on availability and network limitations. Broadband performance and results may vary due to environmental factors and other variables. See www.intel.com/go/wimax/ for more information.
Tuesday, September 15, 2009
Once the darling of the booming cellphone business, and still the largest handset maker, Nokia has become a Finn sandwich. I predict its PC and cellphone competitors will surround and slowly eat the market leader.
Nokia's latest products tell the story of its loss of leadership.
The N900, announced late last week, is Nokia's attempt at an iPhone killer. In fact, Nokia has already lost the battle for smart phone dominance, and even the race to be a runner-up to the iPhone.
While the company experimented with Internet tablets, Apple not only launched the most compelling mobile experience for accessing the Web, it defined the smart phone battle as PC 2.0—having the biggest, most active ecosystem of software developers.
For a year or more Apple's ads have been trumpeting the notion that for any cool idea you might think of it has "an app for that." Behind the scenes of its fight for consumer mindshare, Apple has rallied developers behind a solid OS based on a subset of its well known desktop software. Kleiner Perkins' $100 million venture capital fund for iPhone app startups didn't hurt, either.
By contrast, the N900 is based on Maemo, Nokia's Linux variant, which the company has rarely used despite five generations of development. Developers are getting mixed messages from Nokia about whether they need to support Maemo, the upcoming open source version of Symbian that Nokia is developing, or Moblin, the mobile Linux variant from Nokia's new design partner, Intel.
No doubt there are some lively meetings in Espoo on this issue these days. Meanwhile, Apple is winning the smart phone battle, and developers are getting the idea that the Google Android OS might morph into their second-best bet, an iPhone like environment for the rest of us. After all HTC and Samsung are already shipping iPhone-like Android handsets.
While Apple has taken leadership of the consumer smart phone, Research in Motion is doing a decent job hanging on to its lead in the corporate world. It has long owned this market with the "Crackberry," the addictive handset for mobile email. It is already selling its Storm and Bold handsets that deliver iPhone-like cachet to its business users while Nokia is still getting the N900 ready for an October release.
Make no mistake, the N900 is a great design. It packs a Texas Instruments OMAP 3430 with the ARM Cortex-A8 processor, up to 1Gbyte of application memory and OpenGL ES 2.0 graphics acceleration. It supports 10/2 HSPA, Wi-Fi and Adobe Flash 9.4. It has 32Gbyte of storage expandable to 48Gbyte and a 5Mpixel camera with Carl Zeiss optics.
One heck of a device—and at an estimated 500 Euros ($710) —one heck of a price. Were it not for the high price, slowness to market, software confusion and wide availability of alternative handsets, it might have been a decent but distant number two to the iPhone.
The service, Motoblur, syncs contacts, posts, messages, photos and other items from multiple sources including email accounts and social networking services like Facebook, Twitter and MySpace, and automatically delivers them to the smart phone's home screen.
"We really believe that Motoblur is going to be a differentiator for us," said Motorola Co-CEO Sanjay Jha, in introducing the product at the GigaOM Mobilize '09 conference here.
Motorola's new 3G handset will be known as Cliq in the United States and Dext elsewhere in the world, Jha said. It will be available in the United States later this quarter exclusively through T-Mobile, and in other parts of the world sometime in Q4.
Cliq features a 3.1-inch HVGA touchscreen display, a 5Mpixel auto focus camera with video capture and playback at 24fps, a 3.5mm headset jack, a music player with pre-loaded Amazon MP3 store application, Shazam, iMeem Mobile, and a pre-installed 2Gbyte microSD memory card with support for up to 32Gbyte of removable memory, according to Motorola.
Back in the game
After a string of successful products including the popular Razr, Motorola has slipped in recent years. The firm now ranks fourth in the world among handset manufacturers, trailing Nokia, Samsung and LG Electronics. In smart phones, Motorola has been outshone by Apple, Palm, Research in Motion, Samsung and others. Jha acknowledged that the company was in need of compelling new phones.
"It finally looks like Motorola is putting out more competitive devices, which for a while has been a problem," said Allan Nogee, a principal analyst at InStat, of the Cliq introduction.
Tina Teng, senior analyst for wireless communications at market research firm iSuppli Corp., said while Motorola has had a pretty successful smart phone available in China, the firm has lagged behind competitors in the North American market. Teng noted that the success of the Cliq depends not only on the performance and capabilities of the device itself, but also the community developing applications for Android.
"For smart phones, the competition is not in the hardware itself, but the applications," Teng said. "I would expect Motorola added on their own applications, which can be totally different from what we have seen in G1."
The G1, made by HTC Corp., was the first Android-based handset introduced in the U.S. last year.
Nogee said Motorola's entry into the Android-based smart phone market is certainly late, and that the company has foundered for a few years, losing significant market share. But he added that cellphone users are pretty fickle and likely to buy one brand of phone one time and another the next.
Sony's 3D compatible BRAVIA LCD TVs incorporate frame sequential display and active-shutter glass systems, together with its proprietary high frame rate technology to enable the reproduction of full high definition high-quality 3D images, and will form the centerpiece of Sony's 3D entertainment experience for the home.
In addition to 3D compatible BRAVIA LCD TVs, Sony will also develop 3D compatibility into many more of its devices, such as Blu-ray disc products, VAIO and PlayStation3, to provide a multitude of ways in which 3D content—from 3D movies to stereoscopic 3D games—can be enjoyed in the home.
In the growing industry of 3D cinema, Sony has supported and driven the expansion of 3D by providing a wide variety of professional equipment for the shooting, production and screening of movies in 3D. The number of digital 3D screens is increasing rapidly, and is expected to reach 7,000 by the end of 2009. In addition to 3D movies, Sony's range of professional 3D products is also driving the growth of 3D production and distribution across a range of entertainment industries, from theater and music performances to sport and beyond.
Embracing the "make.believe" (make dot believe) philosophy, which signifies the company's ability to turn ideas into reality, Sony will strive to further enhance synergies across its group companies. Sony will leverage its wealth of technology and engineering resources spanning both professional and consumer markets to bring the optimum 3D viewing experience to the home, from 2010 and beyond.
Intel Corp. announced its latest entry into the solid-state drive market with the Intel Z-P140 PATA solid-state drive designed for handheld mobile devices. Smaller than a penny and weighing less than a drop of water, these 2Gbyte and 4Gbyte ultrasmall devices are fast, low-power and rugged, with the right size, capacity and performance for mobile Internet devices, digital entertainment and embedded products, said Intel.
SSDs use flash memory to store operating systems and computing data, emulating hard drives. The Intel Z-P140 PATA SSD has an industry standard parallel-ATA (PATA) interface and is optimized to enhance Intel-based computers, and will be an optional part of Intel's Menlow platform for mobile Internet devices debuting in 2008.
The Intel Z-P140 is the smallest SSD in its class, making it attractive to designers and manufacturers of mobile and ultra-mobile devices. Comparatively, the Intel Z-P140 is 400 times smaller in volume than a 1.8-inch HDD, and at 0.6g is 75x lighter. It is also a much more durable alternative to HDDs.
The 2Gbyte and 4Gbyte capacities are large enough to store mobile operating systems, applications and data such as music or photos. It is extendable to 16Gbyte for added storage capacity.
"Our mission is to provide world-class non-volatile SSD and caching solutions that are designed, optimized and validated to enhance Intel Architecture-based computing platforms," said Pete Hazen, director of marketing for Intel's NAND products group. "Our customers are finding the Intel Z-P140 PATA SSD to be the right size, fit and performance for their pocketable designs. This is Intel's latest offering as we continue to expand our product line of reliable, feature-rich and high-performing SSDs."
The Intel Z-P140 PATA SSD offers read speeds of 40MBps and write speeds of 30MBps. Critical to mobile applications, its active power usage is 300mW, and only 1.1mW in sleep mode, which helps to extend a device's battery life.
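Those figures make the energy cost of moving data easy to estimate. The following is a back-of-envelope sketch using only the read/write speeds and active-power number quoted above; the helper function and the 100 MB example are ours, not Intel's:

```python
# Energy cost of I/O on the Z-P140, from the quoted datasheet figures.
ACTIVE_POWER_W = 0.300   # 300 mW while actively transferring
SLEEP_POWER_W  = 0.0011  # 1.1 mW in sleep mode
READ_MBPS  = 40.0        # sequential read, MB/s
WRITE_MBPS = 30.0        # sequential write, MB/s

def transfer_energy_mj(size_mb, rate_mbps, power_w=ACTIVE_POWER_W):
    """Energy in millijoules to move size_mb megabytes at rate_mbps."""
    seconds = size_mb / rate_mbps
    return power_w * seconds * 1000.0

print(transfer_energy_mj(100, READ_MBPS))   # reading 100 MB: 750 mJ
print(transfer_energy_mj(100, WRITE_MBPS))  # writing 100 MB: 1000 mJ
```

Reading 100 MB costs well under a joule of energy, which is why an SSD like this barely dents a handheld's battery compared with spinning media.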
With a 2.5 million hours MTBF rate, this PATA-based chip scale package delivers reliable solid-state performance in an extremely tiny footprint. The Intel Z-P140 is currently sampling with mass production scheduled in the first quarter of 2008. The 4Gbyte version will follow the 2Gbyte product.
The Intel Z-P140 PATA SSD adds to the current Intel Z-U130 USB Solid-State Drive family introduced last March. The Intel Z-U130 USB Solid-State Drive has a USB standard interface and is used as a faster storage alternative for a variety of Intel-based computing platforms such as servers, emerging market notebooks and low-cost PCs as well as embedded solutions.
Intel has also demonstrated technology for future high-performance SSDs with a serial ATA interface that will round out the full family of Intel SSD offerings. The technology will be announced as a product line in 2008.
Intel is expected to disclose June 6 its launch plans surrounding its next-generation Core 2 microprocessors, as well as a new low-voltage Core Duo chip that will be featured in thin-and-light notebooks from Dell and Hewlett-Packard.
Of longer-term importance, however, will be the introduction of the Intel P965, or "Broadwater," chip set, which marks the end of older parallel ATA disk drives and IDE storage within the PC.
The introductions are scheduled to be made in a speech delivered by Anand Chandrasekher, Intel's senior vice president and general manager of its marketing group, at the Computex show in Taipei.
While Intel's Core chips, including its Core Duo, continue to outsell Advanced Micro Devices' own Athlon64 and Athlon64 X2 components, AMD's Athlon line has traditionally outperformed Intel's chips. However, Intel plans to begin shipping server, desktop and laptop processors based on the company's new Core microarchitecture in June, July and August, respectively, Chandrasekher is expected to announce.
By 2013, analysts predict that one in every three mobile phones sold will be a smartphone. Entertainment-based mobile devices are also on the rise with the digital still camera (DSC) household penetration rate expected to reach 80% by 2010 in the United States alone.
Wireless technologies such as Bluetooth, Ultra-wideband (UWB) and WLAN all make this wireless data transfer viable today, yet each has its own obstacles that make its use in the mobile market challenging. Additionally, due to the inherent nature of each of these technologies, some may be better suited than others in applications requiring, for example, high bandwidth or a wide operating range.
In the case of the mobile market, one of the main requirements is long battery life. Therefore, one key factor that must be evaluated when determining which technology to implement in a mobile device design is power consumption.
Throughput is also an important factor as it contributes to the technology's overall power consumption. Choosing the right wireless technology is crucial to developing an optimal, commercially-viable mobile device.
Understanding power consumption
Consider a consumer who spends a vacation capturing photos and video on a digital camera. At the end of the trip, the consumer returns home only to find that the camera has a mere 5% of its battery power left, making it impossible to transfer the camera's content to another device, such as a desktop computer.
During typical camera usage, this is exactly the type of capability that today's consumer demands.
Given this scenario, it is easy to see why low power consumption and high throughput are so critical. Standby power and active power consumption (e.g., in a cellular phone the power attributed to talk time, or in this case, attributed to data transfer) must therefore be reduced in order to increase battery life.
Consequently, today's designers need a wireless technology that supports multiple low-power modes, covering both active operation and standby. Throughput also plays a critical role here, because the relevant metric is energy per Mbit of data transferred: even if a protocol's active power is high, it may ultimately exhibit the lowest total energy consumption, and therefore provide the best power efficiency, if it transfers data extremely fast.
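This tradeoff can be made concrete with a quick calculation. The radio figures below are purely hypothetical, chosen only to illustrate how a higher-power radio can still win on energy per bit by finishing the transfer sooner:

```python
# Energy per Mbit for a radio that is active only while transferring.
# All numbers here are illustrative, not measured values for any product.
def energy_per_mbit(active_power_mw, throughput_mbps):
    """mJ consumed per Mbit transferred: mW / (Mbit/s) = mJ/Mbit."""
    return active_power_mw / throughput_mbps

low_power_slow_radio = energy_per_mbit(active_power_mw=100, throughput_mbps=2)
high_power_fast_radio = energy_per_mbit(active_power_mw=800, throughput_mbps=100)

print(low_power_slow_radio)   # 50 mJ/Mbit
print(high_power_fast_radio)  # 8 mJ/Mbit
```

Even though the fast radio draws eight times the active power, it spends so much less time awake per bit that it is the more power-efficient choice for bulk transfers.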
Microsoft is streamlining its Mac Office offerings by introducing a Business Edition and getting rid of two other editions of Office 2008. The company now offers Office 2008 for Mac Business Edition and Office 2008 for Mac Home & Student, with the business version going for $399.95 (or $239.95 for an upgrade).
The Mac Business Unit (Mac BU) announced the change on its blog today, saying that it was made to help simplify offerings and make it easier for customers to decide what's best for them. Previously, the Mac BU offered three versions of Office 2008: Office 2008 for Mac, Office 2008 for Mac Special Media Edition, and Office 2008 for Mac Home and Student Edition. Needless to say, by the titles alone, those products are hard to differentiate—how do I know whether I just want regular Office 2008 for Mac, or Home and Student edition? What's the deal with Special Media Edition?
According to the Mac BU, the Business Edition of Office 2008 has all the standard applications (Word, Excel, PowerPoint, Entourage, all SP2), as well as "new tools to provide a more complete productivity package for the enterprise." Those include Entourage 2008 Web Services Edition, Document Connection for Mac, and numerous Extras (such as templates, clip art, and lynda.com training). The blog post doesn't specifically talk about the Home & Student edition, so we assume it's the same package that Microsoft offered previously—it contains all the main applications, minus the Exchange and Automator support that comes with the regular version of Entourage for $149.95.
We're sure there are some users who will mourn the passing of the Special Media Edition—a version that came with Microsoft Expression Media, a digital asset management suite that let users import, annotate, organize, archive, search, and distribute their media files. However, it's clear that its inclusion in the lineup was just confusing users, and it likely wasn't selling all that well at $499.95 anyway. With just two clearly named options, it's much easier for users to decide which version they need and get on with being productive.
Apple announced Tuesday that former SVP and general counsel of Intel, Bruce Sewell, will be replacing the retiring Daniel Cooperman as the head of Apple's legal department. Intel had announced yesterday that Sewell had decided to "leave the company to pursue other opportunities," amid the announcement that CTO Pat Gelsinger was making a similar move. Sewell will become the senior vice president of Legal and Government Affairs as well as general counsel beginning next month.
Sewell served as Intel's general counsel since 2001 and had served on the company's legal team since 1995. Most recently, he has been working on an appeal for the huge fine levied by the European Union for anticompetitive practices. Prior to working for Intel, he was a partner with Brown and Bain PC, which Fortune notes was the law firm that represented Apple in its "look and feel" case against Microsoft over copying of the Mac's GUI.
Sewell is Apple's third general counsel in as many years, ever since former general counsel Nancy Heinen was let go. Heinen left after nine years with the company following an internal Apple investigation into backdating of stock option grants in 2006. Her replacement, Donald Rosenberg, left Apple to join Qualcomm after less than a year. Steve Jobs then hired Daniel Cooperman away from Oracle to replace Rosenberg in late 2007.
At Apple, Cooperman oversaw all of Apple's legal business, including worldwide legal policies, corporate governance, securities compliance, commercial licensing, intellectual property, employment law, litigation, patent law, mergers and acquisitions and legal support for Apple’s various business units. He also managed Apple's Government Affairs and Global Security groups. Sewell will certainly have his work cut out for him, but his experience at Intel should have him well-prepared for working at Apple. "With Bruce's extensive experience in litigation, securities and intellectual property, we expect this to be a seamless transition," said Jobs in a statement.
Moorestown's massive reduction in idle power draw is accomplished using the same basic technology, power gating, that Intel used to reduce Nehalem's idle power. Power gating lets Intel address the problem of leakage current, which I've gone into some detail on in a previous post. And, also like Nehalem, Intel has divided the Lincroft SoC into different power and clock regions that can be downclocked or turned off independently of one another.
Also included is an increased number of clockspeed levels at which parts of the Lincroft SoC can operate. The idea here, as in all dynamic power optimization schemes, is to dynamically scale frequencies to match the workload. By adding more granular frequency scaling options for Lincroft's different functional blocks, the part can more closely fit its performance profile to a workload's needs within a given timeslice.
The main problem with doing this kind of dynamic frequency scaling aggressively in normal server or desktop computing applications is that there's always some latency involved in these power state transitions, and that latency saps performance. (In other words, all of the frequency scaling potential in the world is no good if the chip takes too long to react to and adapt to real-time changes in a workload.) But my guess is that this latency/performance issue is less critical in mobile applications for a variety of reasons (limited multitasking, relatively simple applications, low OS overhead, etc.), so Intel can just go nuts with the number of power states.
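The core of such a scheme is simple: pick the lowest operating point whose capacity covers the predicted demand for the next timeslice. Here is a toy governor sketching that idea; the frequency bins and workload numbers are invented for illustration and do not reflect Lincroft's actual operating points:

```python
# Toy DVFS governor: choose the lowest frequency bin that can satisfy the
# predicted demand for the upcoming timeslice. Finer-grained bins let the
# chip track the workload more closely. Values are hypothetical.
OPERATING_POINTS_MHZ = [200, 400, 600, 800, 1100]

def pick_frequency(required_mcycles, timeslice_ms):
    """Lowest operating point covering required_mcycles in timeslice_ms."""
    demand_mhz = required_mcycles / (timeslice_ms / 1000.0)  # Mcycles/s = MHz
    for freq in OPERATING_POINTS_MHZ:
        if freq >= demand_mhz:
            return freq
    return OPERATING_POINTS_MHZ[-1]  # saturate at the top bin

# A burst needing 30 Mcycles in a 100 ms slice implies 300 MHz of demand,
# so the governor settles on the 400 MHz bin.
print(pick_frequency(30, 100))  # 400
```

The transition-latency caveat in the text maps directly onto this sketch: every time `pick_frequency` changes its answer, a real chip pays a switching cost, which is why aggressive scaling suits bursty, lightly multitasked mobile workloads better than servers.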
Intel has also extended this dynamic frequency scaling to Moorestown's memory bus, so that the platform can scale memory latency and bandwidth (and bus power draw) to match the current workload.
Moorestown also implements hyperthreading to boost performance-per-watt, but I really can't see this doing anything for a smartphone. This is a feature that will help Moorestown in netbooks, if the platform finds a use in that vertical.