
Welcome !!!

Hi,
Welcome to the Tech-Info world, where you'll find the latest tech updates and much more !!!

Just have a look !!!

Monday, February 28, 2011

"How Virtual Computing Works"



HP TouchSmart PC, Core i3 and Core i5

HP's latest product line looks brilliant. The highlight products are the redesigned TouchSmart PCs, the TouchSmart 300 and the TouchSmart 600, now built with the Core i3 and Core i5. Also included are a "Beats Audio" program and a range of pre-loaded software. The Core i3 and Core i5 models are priced at $799 and $1,099, respectively.

Also, HP has revealed its new All-In-One 200-5020 machine, priced at $699, featuring a 21.5-inch 1080p touch screen, Windows 7, WiFi, an optional mouse and keyboard, a DVD burner, an integrated webcam and the MediaSmart software package.

Thirdly, there's the HP Compaq Pro Ultra Slim. Measuring 10 inches high and 2.6 inches wide, it includes ATI's Radeon 4200 GPU and will retail at $599.

All of these new HP product releases should be available for purchase by the end of this month.

Tuesday, February 22, 2011

Lenovo launches new, improved ThinkPads


The ThinkPad line of laptops has long been a mainstay in the business world, and if Lenovo has anything to do with it, the newest models in its ThinkPad L, T, and W series will continue the tradition. The new laptops—the T420s, T420, T520, L420, L520 and W520—add new video and voice calling functionality, better power management, and new features for better IT management.
Lenovo says the new laptops incorporate what it calls "self-aware and adaptive technologies," which can power down components not needed for the task at hand, thereby helping conserve battery life. This apparently translates to 30 percent more battery life. In addition, the company says its new thermal design allows the laptops to run at a higher clock speed for longer periods of time.
The new ThinkPads also add features to improve their video and voice-calling functionality. This includes the ability of the integrated dual-array mics to go into private chat mode (essentially going uni-directional) or conference mode (which creates a 360-degree listening area), as well as new keyboard noise suppression technology.
A boon for IT managers will be the fact that the new models will share the same docking station, as well as common batteries. The three lines will also have continuous wireless connection during sleep mode for up to 99 minutes, which means the user's wireless connectivity will stay live even with the laptop's lid shut.
The 14-inch L420 and 15.6-inch L520 are the newest additions to Lenovo's entry-level business laptop line. Both models can be configured with up to a 2.7GHz Intel Core i7-2620M CPU and a maximum of 8GB of DDR3 memory. Both also sport four USB 2.0 ports, one eSATA port, a 54/34mm ExpressCard slot, a multimedia card reader, and DisplayPort.
The new models in the ThinkPad T series of performance laptops are the T420s, T420, and T520. The T420 can be configured with up to a 2.7GHz Intel Core i7-2620M CPU, and comes with two USB 2.0 ports and one combo USB 2.0/eSATA port. Meanwhile, the T520 can be configured with up to a 2.3GHz Intel Core i7-2820QM processor and has three USB 2.0 ports (the configuration with a discrete graphics card will have two USB 2.0 ports and one USB/eSATA combo port). For improved audio, the ThinkPad T420s is the first business-class laptop to offer Dolby Home Theater v4, Dolby's latest audio technology with virtualized surround sound and dialog enhancement.
The ThinkPad W series of mobile workstations adds the 15.6-inch W520 to its stable. It can be configured with up to a 2.5GHz Intel Core i7-2920XM Extreme processor, up to 32GB of DDR3 memory, and either the Nvidia Quadro 1000M graphics card (with 96 CUDA cores) or the Nvidia Quadro 2000M graphics card (with 192 CUDA cores), both with Nvidia's Optimus technology. The W520 comes with ISV certification for DSS CATIA, SolidWorks, Autodesk Inventor, AutoCAD, Adobe and Maya.

Grid Computing


Grid computing (or the use of a computational grid) is applying the resources of many computers in a network to a single problem at the same time - usually to a scientific or technical problem that requires a great number of computer processing cycles or access to large amounts of data. A well-known example of grid computing in the public domain is the ongoing SETI (Search for Extraterrestrial Intelligence) @Home project in which thousands of people are sharing the unused processor cycles of their PCs in the vast search for signs of "rational" signals from outer space. According to John Patrick, IBM's vice-president for Internet strategies, "the next big thing will be grid computing."
Grid computing requires the use of software that can divide and farm out pieces of a program to as many as several thousand computers. Grid computing can be thought of as distributed and large-scale cluster computing and as a form of network-distributed parallel processing. It can be confined to the network of computer workstations within a corporation or it can be a public collaboration (in which case it is also sometimes known as a form of peer-to-peer computing).
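The "divide and farm out" idea can be sketched on a single machine with Python's standard multiprocessing module. In a real grid, middleware such as BOINC (the platform behind SETI@home) ships chunks to remote volunteer machines instead of local worker processes; the chunk size and the prime-counting task below are arbitrary illustrations, not part of any particular grid framework.

```python
from multiprocessing import Pool

def count_primes(bounds):
    """Worker task: count primes in [lo, hi) by trial division."""
    lo, hi = bounds
    count = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    # Split one large problem into independent chunks of work...
    chunks = [(i, i + 25_000) for i in range(0, 100_000, 25_000)]
    # ...and farm the chunks out to a pool of workers.
    with Pool(processes=4) as pool:
        partial_counts = pool.map(count_primes, chunks)
    print(sum(partial_counts))  # total primes below 100,000
```

Each chunk is independent of every other, so it does not matter which worker (or, in a grid, which machine on which continent) computes it; that independence is what makes a problem a good fit for grid computing.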
A number of corporations, professional groups, university consortiums, and other groups have developed or are developing frameworks and software for managing grid computing projects. The European Community (EU) is sponsoring a project for a grid for high-energy physics, earth observation, and biology applications. In the United States, the National Technology Grid is prototyping a computational grid for infrastructure and an access grid for people. Sun Microsystems offers Grid Engine software. Described as a distributed resource management (DRM) tool, Grid Engine allows engineers at companies like Sony and Synopsys to pool the computer cycles on up to 80 workstations at a time. (At this scale, grid computing can be seen as a more extreme case of load balancing.)
Grid computing appears to be a promising trend for three reasons: (1) its ability to make more cost-effective use of a given amount of computer resources, (2) as a way to solve problems that can't be approached without an enormous amount of computing power, and (3) because it suggests that the resources of many computers can be cooperatively and perhaps synergistically harnessed and managed as a collaboration toward a common objective. In some grid computing systems, the computers may collaborate rather than being directed by one managing computer. One likely area for the use of grid computing will be pervasive computing applications - those in which computers pervade our environment without our necessary awareness.

There's a Right Way to Reach the Open Cloud


The Cloud's Open Source Roots
Open source became one of the core foundations of cloud computing as early cloud pioneers used the freely available, freely distributable model to power their Web-scale deployments -- achieving an unprecedented level of scale at a bare-bones cost. Amazon (Nasdaq: AMZN) built its cloud to be a business, making it possible to easily recoup its investment by charging other companies usage fees.
Most other companies, though, are not building a cloud to resell. To enterprises today, the attraction of open source is about the ability to develop a more flexible infrastructure and avoid vendor lock-in that often results from proprietary systems.
They want to use open technologies to get their cloud infrastructure up and running at the lowest cost and in the fastest way possible. At the same time, they gain the flexibility of customization that open source offers; they can build whatever they want.
When planning a cloud infrastructure, consider whether you want to build it yourself or choose to partner with an open source vendor that has completed 80 to 90 percent of the work, leaving you to do only 10 to 20 percent to customize it for your company.

The Case for Open Source

The cloud comprises commoditized open source components, which many people -- myself included -- believe is the only way possible to achieve the true economics of the cloud.
Consider Amazon, a huge enterprise cloud provider that is using open source Xen... with open source Hadoop... with open source Chef... with open source insert-component-name-here.
In fact, quite a few large-scale enterprise vendors are developing solutions termed "cloud," including Google (Nasdaq: GOOG), Yahoo (Nasdaq: YHOO) and Salesforce.com (NYSE: CRM), among others. How are they managing to do this? By using open source software on commodity hardware and scaling it across everything they do.
A company that tries to implement a cloud infrastructure in a closed-source manner is already going against the grain, adopting an approach contrary to that of the companies that built the cloud. If you consider doing things the closed-source way, you should prepare yourself to buy more expensive hardware, experience vendor lock-in, and perhaps be surprised by unexpected costs.

Considerations on Implementation

There is a broad spectrum of cloud solutions, each offering a different degree of finish and polish, whether your company needs a cloud that integrates with its legacy enterprise network elements or one that supports the rules and regulations around compliance.
As you evaluate enterprise-grade cloud vendors and solutions, be sure to consider the following factors/questions:
  • How flexible is the technology?
  • What are you trying to accomplish with the cloud -- i.e., where are the optimal workloads in your organization that would benefit the most from a cloud infrastructure?
  • Is it truly enterprise-grade? Does it support integration or partnerships?
  • How much of the solution is complete?
  • Does the vendor have referenceable implementations? If so, do they match the scale of your planned deployment? Some vendors reference results based on a lab setup of five to 10 computers, which is not too challenging; testing on tens of thousands of computers is a different story altogether.
  • How much support will the vendor provide?

The success of the deployment will ultimately define the success of the open source project. With Linux, the vendors that had the most enterprise users did so because they achieved the right level of solution and innovation for the enterprise.
Enterprises typically need the best support, code, interoperability, technology and so forth. Success will also depend on a company determining where the cloud makes the most sense in the organization and where it can provide maximum benefit to the business.

The Clout of Community

By their nature, open source technologies encourage collaboration and community creation. Community member companies dedicate themselves to open standards and innovation, resulting in greater choice and flexibility to consumers of their technologies.
For example, the communities forming around Linux distributions like Red Hat (NYSE: RHT), Ubuntu and CentOS, as well as virtualization technologies like KVM and Xen, are driving forces in the establishment of cloud computing standards. OpenStack is one of the largest open source projects on the planet.
For service providers, this means a variety of software and services are available to customize and deploy their offerings. For enterprises, it means they are able to quickly and easily migrate solutions to the vendors that provide the best service and support for their specific needs.
When considering vendors for your cloud deployment, look at which vendor was first to gain users and build a community. The winner will be the company able to turn community engagement into a viable commercial product, and the way to determine this is by observing which vendor succeeds in gaining customers.
As customers start to engage and pay for open source technologies, platforms become defined. As platforms become defined, developers want to participate in the most active project community, and they contribute their intelligence and coding skills to further develop that platform.
Customers are drawn to the project by its constant improvements and evolution. It becomes a cycle that feeds the success of the open source project and platform: Developers bring customers, and customers bring more developers.
As your company begins to enter the cloud, you will want to have vendors that are actively engaged in their community to help drive the interoperability and flexibility you require.

Sunday, February 20, 2011

Need a supercomputer? This guy builds them himself


Network World - Bruce Allen is perhaps the world's best do-it-yourselfer. When he needed a supercomputer to crunch the results of gravitational-wave research, he built one with his colleagues at the University of Wisconsin-Milwaukee.
That was in 1998, and since then he's built three more supercomputers, all in pursuit of actually observing gravitational waves, which theoretically emanate from black holes orbiting each other and from exploding stars but have never been directly observed.
His most recent supercomputer, a cluster of 1,680 machines with four cores each, is in Hanover, Germany. Essentially, it's a 6,720-core processor that in the months after it was built was ranked No. 58 in the world. "We filled our last row of racks recently, and we're No. 79 on the current Top500 list," says Allen, director of the Max Planck Institute.
He builds his own for several reasons, one of which is that he thinks he gets more for his money when he does the work himself.

Just 3 Indian supercomputers in global Top 500 list

New ratings see both ‘Made in India’ machines in list – at lower ranks
AMD fuels world’s fastest computer – at 1.75 petaflops
The latest half-yearly ratings of the world’s 500 fastest computers find that the number of India-based machines has shrunk from six to three since the last ratings in June. The two made-in-India platforms are still in the list, but the number of machines used by private commercial agencies is down to just one, at an unnamed location.
Tata’s Eka supercomputer at Pune’s Computational Research Laboratories (CRL) remains the fastest Indian supercomputer, its top performance of 132.8 teraflops unchanged. However, shifting goalposts in high-performance computing see the Eka slip from No. 18 to No. 26 in the list.
The government-run Centre for Development of Advanced Computing (C-DAC), also headquartered in Pune, sees its Param Yuva machine at No. 137 in the latest ratings, announced November 17. Its speed is in fact a bit faster than before, at 38.1 teraflops, but its rank is down from 109 in June.
Both platforms are clusters, indigenously assembled, using Intel Xeon-powered nodes sourced from HP.
The only other India-based supercomputer is a 28.357-teraflop cluster, also built on HP hardware, ranked No. 247.

China unveils 2.5-petaflop supercomputer


IDG News Service - China is unveiling a new supercomputer on Thursday that incorporates thousands of graphics chips and can reach a sustained performance of 2.5 petaflops, making it one of the fastest systems in the world.
Located at the National Supercomputing Center in Tianjin, the Tianhe-1A supercomputer has 7,168 Nvidia Tesla M2050 GPUs (graphics processing units), each with 448 processor cores. It also has 14,336 six-core Intel Xeon CPUs.
The supercomputer was built by China's National University of Defense Technology and is "the fastest system in China and in the world today," Nvidia claimed in a press release.
Besides a sustained performance of 2.5 petaflops measured by the Linpack benchmark, it has a theoretical performance of 4.669 petaflops when all the GPUs are operational, according to an Nvidia spokesman. The benchmarks were provided by the National Supercomputing Center in China, he said.
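The gap between the two figures is the machine's Linpack efficiency: the fraction of theoretical peak it actually sustains on the benchmark. A quick back-of-the-envelope check using the numbers quoted above (not an official figure) shows why GPU-heavy hybrids of this era typically sustained a smaller fraction of peak than CPU-only systems:

```python
# Sustained (Linpack) versus theoretical peak performance for Tianhe-1A,
# using the figures quoted in the article above.
rmax_pflops = 2.5     # sustained performance on the Linpack benchmark
rpeak_pflops = 4.669  # theoretical peak with all GPUs operational

efficiency = rmax_pflops / rpeak_pflops
print(f"Linpack efficiency: {efficiency:.1%}")  # prints "Linpack efficiency: 53.5%"
```

Sustaining only about half of peak reflects how hard it is to keep thousands of GPUs fed with data during a dense linear-algebra run; CPU-only machines of the time generally sustained a noticeably higher fraction of their peak.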
China has been moving up the supercomputing ranks in recent years. The last Top500 list of the fastest supercomputers, issued in June, lists the Nebulae supercomputer in Shenzhen as the world's second fastest. That system also combines Intel CPUs and Nvidia GPUs.
The fastest supercomputer in the world according to the June list is the Jaguar system at the U.S. Department of Energy's Oak Ridge National Laboratory, which can deliver 1.76 petaflops of sustained performance.
The Tianhe-1A was announced two weeks before the release of the next Top500 list, so it is too early to say if it will be the fastest system on the list.
China is looking to boost its computing resources and is doing a lot with hybrid supercomputer designs, said Nathan Brookwood, principal analyst at Insight 64.
A number of supercomputers have combined GPUs with CPUs to boost performance. GPUs are specialized co-processors that are faster than traditional CPUs at executing certain tasks, like those used in scientific and mathematical applications.

Saturday, February 19, 2011

IBM's Watson could usher in new era of medicine


Computerworld - The game-show-playing supercomputer Watson is expected to do much more than make a name for itself on Jeopardy.
IBM's computer could very well herald a whole new era in medicine.
That's the vision of IBM engineers and Dr. Eliot Siegel, professor and vice chairman of the University of Maryland School of Medicine's department of diagnostic radiology.
Siegel and his colleagues at the University of Maryland and at Columbia University Medical Center are working with IBM engineers to figure out the best ways for Watson to work hand in hand with physicians and medical specialists.
Siegel, who refers to the computer not as the champ of Jeopardy but as "Dr. Watson," says he expects the computer, which can respond to questions with answers rather than with data and spreadsheets, to radically improve doctors' care of their patients.
"There is a major challenge in medicine today," Siegel told Computerworld. "There's an incredible amount of information in a patient's medical record. It's in the form of abbreviations and short text. There's a tremendous amount of redundancy, and a lot of it is written in a free-form fashion like a blog or text.
"As a physician or radiologist, it might take me 10 or 20 or 60 minutes or more just to understand what's in a patient's medical record," he said.
Within a year, Siegel hopes that "Dr. Watson" will change all of that. Watson is expected to be able to take a patient's electronic medical records, digest them, summarize them for the doctor and point out any causes for concern, highlighting anything abnormal and warning about potential drug interactions.
"It offers the potential to usher in a whole new generation of medicine," Siegel said. "If all Dr. Watson did was allow me to organize electronic medical records and bring to my attention what's most important and summarize it, that would be incredibly valuable to me.
"Even small things that Watson can do will change the way I, and my colleagues, practice medicine," he said.

Richard F. Doherty, research director at analysis firm Envisioneering Group, said he's excited to have a computer organize his medical history for his physician.
"That sounds excellent," Doherty said. "I think we've all been through the situation of filling out forms for new doctors, and then they don't have the time to read through it all, and they just say, 'What? You have a sore throat?' Having Watson help attend to our needs sounds like a great application of [the computer]."
But organizing and summarizing patient histories isn't all Watson is expected to do.
Siegel, who also works with the National Cancer Institute, said he's hoping that Watson will also be able to take patient and treatment information from hundreds, if not thousands, of hospitals and pull it all together.
Then, when a doctor is considering treating a patient with a particular drug or treatment, he first can ask Watson how that treatment has worked on patients with similar diagnoses and backgrounds.
"Watson can ingest information efficiently and rapidly," Siegel said. "It'll have an encyclopedic knowledge and suggest diagnostic and therapeutic possibilities based on databases much larger than one physician can possibly hold in his head.
"This technology brings a potential to have a renaissance of medical diagnosis," he added. "It offers the potential for us in the next five or 10 years to routinely deploy computers when working with our patients."
Jennifer Chu-Carroll, an IBM researcher on the Watson project, said the computer system is a perfect fit for the healthcare field.
"There's so much electronic information out there, and it's projected to continue to grow," Chu-Carroll said. "Nobody can possibly ingest all that information. Without a tool, there's no way to leverage it."
She also said she believes that at some point Watson will have a speech-recognition capability so it can actually go into an exam room and listen to a patient talk about his symptoms while it runs through his medical records.
"Think of some version of Watson being a physician's assistant," Chu-Carroll said. "In its spare time, Watson can read all the latest medical journals and get updated. Then it can go with the doctor into exam rooms and listen in as patients tell doctors about their symptoms. It can start coming up with hypotheses about what ails the patient."
She added, "The physician will make the decisions, but Watson can help."
Doherty said that having a supercomputer that can ingest and analyze loads of data and then answer questions much as a human would could radically change not only medical diagnostics, but also medical research and pandemic recognition and management.
"Spotting trends could save lives and save money," he said. "What humans can't always see, Watson may be able to.
"I think we're on the cusp of a revolution," Doherty added.
Sharon Gaudin covers the Internet and Web 2.0, emerging technologies, and desktop and laptop chips for Computerworld. Follow Sharon on Twitter @sgaudin or subscribe to her RSS feed. Her e-mail address is sgaudin@computerworld.com.

IBM's Watson Wins Jeopardy! Next Up: Fixing Health Care


The final round in the epic man vs. machine battle that's been playing out on Jeopardy! all week was fought tonight. The winner: Watson, IBM's supercomputer, which soundly defeated flesh-and-blood opponents Ken Jennings and Brad Rutter, ending a three-night tournament and beginning a long stretch of social and technological commentators trying to figure out what it all meant.
To mark the event, IBM held a viewing party this evening at popular New York ping-pong spot Spin NYC with some of the engineers who helped create the supercomputer. IBM also took the opportunity to announce a collaboration with speech-technology company Nuance to bring Watson-like computing and analysis to the healthcare world.
"We're moving beyond Jeopardy!" Dave Ferrucci, the principal engineer behind Watson, said at the event. "With the Watson technology, we're going to look at creating a commercial offering in the next 24 months that will help empower doctors to do higher quality decision making and diagnoses."
Trebek opened the show by talking about what he learned over the past two days, remarking on Watson's propensity to wager seemingly random amounts on Daily Doubles and Final Jeopardy, and joking that Toronto is now a U.S. city, a reference to Watson's bizarre answer in the previous night's final round. Watson pulled out to an early lead, though Jennings and Rutter soon responded. Jennings came from behind on the first round's Daily Double, pulling ahead to $7,200.
Round Two saw Watson cement its lead, but it also revealed again the computer's tendency to bet strange totals on Daily Doubles, wagering just $367 at one point. By the end of the round, though, Watson led the trio with $23,440.
With Final Jeopardy, it was Jennings' last chance to win, since Rutter had fallen behind. His correct question of "Who is Bram Stoker?" to an answer about 19th-century novelists was accompanied by a message: "I for one welcome our computer overlords." The message was prescient. Watson also had the correct answer, though, winning the game with a total of $77,973. At Spin NYC, the bar erupted into applause at the win.
Ferrucci explained Watson's strange wagers: "They seem random to us mere mortals. What's actually going on there is that the team has trained on human betting patterns. It's considering its confidence. It's also considering where it is in the game, and how much more there is to go. It's a very complex calculation, with very precise results. We could have rounded it, but we figured just give the number."
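Ferrucci's description (confidence plus game state) can be illustrated with a deliberately simplified expected-value model. This is purely a toy for intuition, not Watson's actual betting logic; the function names and the linear payoff model are invented for this sketch.

```python
def expected_score(wager, score, confidence):
    """Expected score after a Daily Double: gain the wager with
    probability `confidence`, lose it otherwise."""
    return confidence * (score + wager) + (1 - confidence) * (score - wager)

def best_wager(score, confidence, max_wager):
    """Choose the whole-dollar wager that maximizes expected score."""
    return max(range(max_wager + 1),
               key=lambda w: expected_score(w, score, confidence))

# A purely linear model always pushes to an extreme: bet everything
# when confidence is above 50 percent, nothing when it is below.
print(best_wager(score=9_000, confidence=0.9, max_wager=5_000))  # 5000
print(best_wager(score=9_000, confidence=0.4, max_wager=5_000))  # 0
```

The all-or-nothing behavior of this toy is exactly why a real betting model needs the extra terms Ferrucci mentions, such as risk, the opponents' scores, and how much of the game remains; those terms are what produce the precise, odd-looking totals like $367.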
What IBM does think Watson is good for is data analysis and aiding decision making, which is why the company's approaching the health care field first. The technology has the ability to scan and analyze data from far more sources than a human ever could in a short period of time, potentially aiding doctors in diagnosing complex but urgent conditions.
Going a step further, Ferrucci speculated that at least part of the technology might someday make its way into mobile devices, bringing Watson-like analysis directly to consumers.
"It's so much better on mobile devices to answer succinctly. That could be very helpful. But ultimately it's more of a business intelligence kind of interaction."
Despite the Jeopardy win and the promise of Watson, Ferrucci is careful to point out that his creation is still no substitute for human decision making. After all, when Watson gets a question wrong—as in the Toronto example—it gets it extremely wrong.
"With Jeopardy! these are human questions written for humans, whereas all the computer has are words. It can't rely on human context to determine things.
"I hope people stop and scratch their heads, and think about the limits of computation, and what does it mean to be intelligent," Ferrucci said. "When you deconstruct this, and look at the machine, is any part of this really understanding the question? No. We don't want computers in my opinion making value judgments about what it means to be human. Only humans can do that."

Wednesday, February 16, 2011

The Future of Watson

Final Jeopardy! and the Future of Watson
The IBM team who designed Watson has achieved another milestone in the history of computer science. After the Jeopardy! challenge concludes, the team faces the task of developing real world solutions based on this technology.
The impact of a machine like Watson will be felt throughout business, government and society. Join the conversation to find out how the IBM team achieved this historic feat and chat live with IBM Watson Principal Investigator Dr. David Ferrucci, IBM Fellow and CTO of IBM’s SOA Center for Excellence Kerrie Holley and Columbia University Professor of Clinical Medicine Dr. Herbert Chase, hosted by "Man v. Machine" author Stephen Baker.
To submit questions to the panel, sign on or join Twitter and use the hashtags #ibmwatson and #askwatson.
Tune in for the webcast on Feb. 17 at 11:30 AM EST.