Monday, November 28, 2011

Data center building, power, and cooling disciplines are not IT disciplines

Data center building, power, and cooling disciplines are not IT disciplines.

Your expertise in applications, software architecture, and network, server, and storage design is not expertise in building Tier IV data centers with 99.995% uptime.

Likewise, experts on mission critical facilities like hardened data center buildings, data center power redundancy and cooling are rarely experts on mission critical systems and applications.

A best-of-breed CIO strategy would include expertise in both information technology systems design and highly available data center facilities. How is this done?

If your organization likes to “roll your own” enterprise data center, you probably hire design/build experts to help you accomplish your goals of high data center uptime. Although the capital costs associated with in-house data centers can be enormous, internal data centers offer the highest level of control.

If your organization is considering outsourcing the facilities disciplines, wholesale colocation offers a simple way to offload the “landlord” side of the data center without losing control of the systems.

It’s often best to outsource data center facilities when you’re great at IT but not so great at building data centers.

Midwest colocation facilities like Lifeline Data Centers offer F5 tornado-resistant buildings, N+N power and cooling redundancy, and access to many telecom providers. Midwest data centers also offer low power costs, giving you peace of mind that you’ve done the best job of solving the data center downtime problem with an affordable colocation solution.

Are you trying to be an expert in both facilities and IT? Talk it over with the mission critical facilities experts.

(Source - http://www.lifelinedatacenters.com)

Sunday, November 27, 2011

Skype, Facebook Expand Video Chatting Capabilities

Posted on Thursday Nov 17th 2011 by Nicholas Kolakowski.

Having been integrated into Microsoft, Skype is now moving ahead with new Facebook integration and some new features for its Mac and Windows versions.

The latest versions of Skype for Mac and Windows now boast the ability to conduct Facebook-to-Facebook calls from within Skype. Starting such a call involves connecting the user's Skype and Facebook accounts, then selecting a Facebook friend with whom to chat.

"This new feature lets you maintain social connections with your Facebook friends and complements previously announced features such as being able to see when your Facebook friends are online," read a Nov. 17 posting on the official Skype blog.

Skype is also smoothing the video-rendering capabilities of Skype 5.4 Beta for Mac, and has added to Skype 5.7 Beta for Windows a group screen-sharing capability for any Windows users with a Premium subscription.

Microsoft purchased Skype for $8.5 billion earlier this year, turning the voice over IP provider into a business division headed by Skype CEO Tony Bates. Microsoft executives have repeatedly announced their intention to tightly integrate Skype's assets with Microsoft products, ranging from Xbox Kinect to Windows Phone, although support for "non-Microsoft client platforms" such as the Mac will apparently continue for the duration.

Microsoft ended up paying far more for Skype than its previous overlord, eBay, which had agreed in 2005 to pay $2.6 billion in cash and stock for the then two-year-old company. Four years later, a team of private investors, including Silver Lake Partners and Andreessen Horowitz, took it off the auction Website's hands for $1.9 billion in cash. Before the Microsoft acquisition, Skype had supposedly been raising money for an initial public offering, but that offering was delayed after the company appointed Bates to the CEO role in October 2010.

Microsoft also has a tightening relationship with Facebook, whose social-networking features (such as the increasingly ubiquitous "Like" button) have been incorporated into the Bing search engine.

Despite the massive Skype acquisition, most of Microsoft's recent corporate activity has centered on partnerships with Facebook, Nokia and the like. This spares Microsoft, despite its considerable financial reservoirs, from having to shell out billions on potentially risky takeovers; however, it also raises the specter of discordance in strategic aims between partners.

(Sources - http://mobile.eweek.com)

From Edison’s Trunk, Direct Current Gets Another Look

Thomas Edison and his direct current, or DC, technology lost the so-called War of the Currents to alternating current, or AC, in the 1890s after it became clear that AC was far more efficient at transmitting electricity over long distances.

Today, AC is still the standard for the electricity that comes out of our wall sockets. But DC is staging a roaring comeback in pockets of the electrical grid.

Alstom, ABB, Siemens and other conglomerates are erecting high-voltage DC grids to carry gigawatts of electricity from wind farms in remote places like western China and the North Sea to faraway cities. Companies like SAP and Facebook that operate huge data centers are using more DC to reduce waste heat. Panasonic is even talking about building eco-friendly homes that use direct current.

In a DC grid, electrons flow from a battery or power station to a home or appliance, and then continue to flow in a complete circuit back to the original source. In AC, electrons flow back and forth between generators and appliances in a precisely synchronized manner — imagine a set of interlocking canals where water continually surges back and forth but the water level at any given point stays constant.
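The contrast can be sketched numerically. A minimal Python illustration, assuming a 120 V RMS, 60 Hz supply (typical North American mains; the values are illustrative assumptions, not figures from the article):

```python
import math

# Assumed values: 120 V RMS, 60 Hz mains (illustrative).
V_RMS = 120.0
FREQ_HZ = 60.0
V_PEAK = V_RMS * math.sqrt(2)  # ~169.7 V peak for a 120 V RMS supply

def ac_voltage(t):
    """Instantaneous AC voltage: the flow reverses direction every half cycle."""
    return V_PEAK * math.sin(2 * math.pi * FREQ_HZ * t)

def dc_voltage(t):
    """Instantaneous DC voltage: a constant, one-directional flow."""
    return V_RMS

# At t = 0 the AC wave is at zero; a quarter cycle later it is at its peak,
# while the DC level never changes.
print(round(ac_voltage(0.0), 3))                 # 0.0
print(round(ac_voltage(1 / (4 * FREQ_HZ)), 1))   # 169.7
print(dc_voltage(0.0))                           # 120.0
```

The "water level stays constant" analogy corresponds to the RMS value: the AC wave surges between +170 V and -170 V, but delivers the same average power as a steady 120 V DC source.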

Direct current was the electrical transmission technology when Edison started rolling out electric wires in the 19th century. Alternating current, which operated at higher voltages, was later championed by the Edison rivals Nikola Tesla and George Westinghouse.

The AC forces won when Tesla and Westinghouse figured out how to fine-tune AC transmission so that it required far fewer power plants and copper cable.

DC didn’t die, however.

AT&T adopted direct current for the phone system because of its inherent stability, which is part of the reason that landline phones often survive storms better than the electric grid.

And household appliances and much industrial equipment — everything from hair dryers to jet planes — are built to use DC. Embedded converters bridge the mismatch between the AC grid and the DC devices on the fly.

But those constant conversions cause power losses. For example, in conventional data centers, with hundreds of computers, electricity might be converted and “stepped down” in voltage five times before being used. All the resulting heat must be removed by air-conditioners, which consume still more power.
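As a rough sketch of why the chain of conversions matters, assume (illustratively; the article gives no per-stage figure) that each conversion stage is about 92% efficient. Losses compound multiplicatively across the chain:

```python
# Illustrative assumption: each AC/DC or voltage-conversion stage
# is ~92% efficient. This figure is not from the article.
STAGE_EFFICIENCY = 0.92

def delivered_fraction(num_conversions):
    """Fraction of input power that survives a chain of conversion stages."""
    return STAGE_EFFICIENCY ** num_conversions

# Five conversions (a conventional data center) vs. two (a DC design
# that roughly halves the conversion count).
conventional = delivered_fraction(5)   # ~0.659: about a third lost as heat
dc_design = delivered_fraction(2)      # ~0.846
print(round(conventional, 3), round(dc_design, 3))
```

Under this assumption, a conventional five-conversion chain turns roughly a third of the incoming power into waste heat before any computing happens, which is the heat the air-conditioners then have to remove.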

In a data center redesigned to use more direct current, monthly utility bills can be cut by 10 to 20 percent, according to Trent Waterhouse, vice president of marketing for power electronics at General Electric.

“You can cut the number of power conversions in half,” Mr. Waterhouse said.

On a far smaller scale, SAP spent $128,000 retrofitting a data center at its offices in Palo Alto, Calif. The project cut its energy bills by $24,000 a year.
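From the figures quoted, a simple payback period for the SAP retrofit can be computed:

```python
# Simple-payback sketch using the SAP figures quoted above.
RETROFIT_COST = 128_000   # dollars, one-time
ANNUAL_SAVINGS = 24_000   # dollars per year

payback_years = RETROFIT_COST / ANNUAL_SAVINGS
print(f"Simple payback: {payback_years:.1f} years")  # → 5.3 years
```

A simple payback of roughly five years, ignoring financing costs and any change in energy prices, which helps explain why the economics are debated (see Google's skepticism later in the article).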

The revival of DC for long-distance power transmission began in 1954 when the Swedish company ASEA, a predecessor of ABB, the Swiss maker of power and automation equipment, linked the island of Gotland to mainland Sweden with high-voltage DC lines.

Now, more than 145 projects using high-voltage DC, known as HVDC, are under way worldwide.

While HVDC equipment remains expensive, it becomes economical for high-voltage, high-capacity runs over long distances, said Anders Sjoelin, president of power systems for North America at ABB.

Over a distance of a thousand miles, an HVDC line carrying thousands of megawatts might lose 6 to 8 percent of its power, ABB said. A similar AC line might lose 12 to 25 percent.
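Using ABB's loss figures, the difference in delivered power is easy to sketch; the 3,000 MW sending-end figure below is an assumption for illustration:

```python
# Compare delivered power for the HVDC vs. AC loss ranges ABB quotes.
# The 3,000 MW sending-end power is an illustrative assumption.
SENT_MW = 3000.0

def delivered(sent_mw, loss_fraction):
    """Power arriving at the far end after line losses."""
    return sent_mw * (1.0 - loss_fraction)

hvdc_best, hvdc_worst = delivered(SENT_MW, 0.06), delivered(SENT_MW, 0.08)
ac_best, ac_worst = delivered(SENT_MW, 0.12), delivered(SENT_MW, 0.25)
print(f"HVDC delivers {hvdc_worst:.0f}-{hvdc_best:.0f} MW")  # 2760-2820 MW
print(f"AC delivers   {ac_worst:.0f}-{ac_best:.0f} MW")      # 2250-2640 MW
```

Even at the favorable end of the AC range, the HVDC line delivers on the order of a couple of hundred megawatts more over the same thousand-mile run.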

Direct-current transmission is also better suited to handle the electricity produced by solar and wind farms, which starts out as direct current.

In most situations, solar or wind energy has to be converted, and sometimes reconverted, into AC before it can be used. With HVDC, conversions can be reduced. DC grids can also more easily manage the variable output that occurs, say, when a storm hits or the wind dies.

In the United States, the Tres Amigas power station in New Mexico will use HVDC links to connect the nation’s three primary grids — the eastern, western and Texas grids. Ideally, it will create a marketplace where customers in New York and Los Angeles will be able to buy power from wind farms in Texas, which often have to dump power because of the lack of local demand.

HVDC Light, a version of HVDC invented by ABB in 1997 that is designed for shorter distances, has started to gain popularity because its cables are coated with extruded plastic. That allows cables to be buried underground more easily, avoiding some of the land-use hearings that have delayed proposals for above-ground AC transmission lines in the United States and Europe.

Direct current is also getting more attention at the level of individual buildings.

Nextek Power Systems, for example, has developed a system for delivering power via DC to lights and motion sensors through a building’s metal frame, instead of through wires.

Paul Savage, chief executive of Nextek, based in Detroit, understands why the public might view that notion with trepidation. But he said the current was not enough to electrocute anyone.

“If you licked your fingers you might get a little bubbly feeling, like if you put a nine-volt battery on your tongue, but it is not noticeable if you’re in a non-wet environment,” he said.

Of course, AC remains by far the dominant standard for electricity, and many are dubious about “DC is better” arguments.

Hardware for HVDC and other direct-current applications is expensive, so capital costs have to be recovered through efficiency. Google, never shy about experimenting with energy-saving technologies, has veered away from DC data centers, claiming that the capital costs do not justify the switch.

Still, sales and sales inquiries are climbing, DC advocates said. Just don’t expect Current War II, said Mr. Sjoelin.

“This is a complement,” he said. “We’re not going back to Edison.”

(Source - http://www.nytimes.com)

Thursday, November 24, 2011

Microsoft is officially looking into buying Yahoo

Yahoo and Microsoft have signed a nondisclosure agreement, and Microsoft is officially looking into purchasing the aging Internet company.

The Microsoft courtship of Yahoo has officially moved from rumor to confirmed. The two companies have taken a very early step in the buying process by signing a nondisclosure agreement, which means Microsoft can now poke around in Yahoo’s financials to see if it wants to make an offer.

There have been several recent rumors of Microsoft trying to buy Yahoo, as well as of several other possible suitors such as AOL. While other companies may be interested in buying Yahoo, none of the other big-name companies have signed a nondisclosure. Now that Microsoft is able to look over Yahoo’s books, it can decide what exactly it wants to do moving forward.

It still isn’t clear if Microsoft will be buying Yahoo outright or if it will partner with equity firms and make an investment in the search giant. A couple of private equity companies have also signed nondisclosures with Yahoo, but their interest looks to be in making an investment rather than making a purchase.

There are several reasons why Microsoft should be interested in Yahoo, most notably search. Yahoo powers the sales behind Microsoft’s Bing search results, and we are sure Microsoft wouldn’t want someone else taking that over. It is also possible that Microsoft will want to integrate Skype into some of Yahoo’s services, such as Yahoo Messenger.

No matter what ends up happening, Yahoo needs some form of help, either from Microsoft or someone else. Even Yahoo’s iconic San Francisco billboard is feeling the crunch. This is only the first step in the process, and it is by no means a sure signal that Microsoft will actually buy Yahoo.

(Source - http://www.digitaltrends.com)

Sunday, November 20, 2011

Iran detects Duqu virus in Governmental System.

Iran said Sunday that it had detected the Duqu computer virus, which security researchers have argued is based on Stuxnet, the malware believed to be aimed at sabotaging the Islamic Republic's nuclear sites, according to a report.

Gholamreza Jalali, head of Iran's civil defense organization, told the Islamic Republic News Agency (IRNA) that computers at all main sites at risk were being checked and that Iran had developed antivirus software to fight the virus.

"We are in the initial phase of fighting the Duqu virus," Jalali said. "The final report that says which organizations the virus has spread to and what its impacts are has not been completed yet. All the organizations and centers that could be susceptible to being contaminated are being controlled."

Word of the Duqu computer virus surfaced in October, when security vendor Symantec said it had found a virus whose code was similar to Stuxnet, the cyberweapon discovered last year. While Stuxnet was aimed at crippling industrial control systems, security researchers said Duqu seemed to be designed to gather data so that future attacks would be easier to launch.

"Duqu is essentially the precursor to a future Stuxnet-like attack," Symantec said in a report last month, adding that instead of being designed to sabotage an industrial control system, the new virus could gain remote access capabilities.

Iran also said in April that it had been targeted by a second computer virus, which it called "Stars". It was not clear if Stars and Duqu were related but Jalali had described Duqu as the third virus to hit Iran.

(Source - http://www.zdnetasia.com)

Saturday, November 19, 2011

Siemon, Cisco, Intel and Aquantia team up to discuss 10GBASE-T adoption in the data centre

At a recent Emerging Technology Forum in Portland, USA, experts from leading network infrastructure companies Siemon, Cisco, Intel and Aquantia addressed key advances and considerations in the trend towards increasing market adoption of 10 Gigabit Ethernet (10GBASE-T) technologies in the data centre.

Topics covered included key 10GBASE-T market drivers and projections, the evolution of server connectivity, decreasing power needs, and cabling design options with 10GBASE-T, among others. The event offered actionable advice for networking professionals on critical 10GbE decision points across the data centre infrastructure.

Panel contributors included Dave Chalupsky, Intel Network Architect, Carl Hansen, senior product manager with Intel’s Data Centre Standards group, Carrie Higbie, Siemon’s global director of data centre solutions & services, Sudeep Goswami, product line manager of Cisco’s Server Access and Virtualization Business Unit and group chair for the Ethernet Alliance 10GBASE-T committee and Sean Lundy, director of technical marketing at Aquantia.

According to Siemon’s Carrie Higbie, category 6A and higher connectivity is being planned in new data centres: “85% of the new data centre designs we see are cabling for 10GBASE-T.” Higbie also noted a continuing upswing in the global use of shielded cabling for 10GBASE-T, including in traditionally UTP-dominant markets such as the US.

Siemon has been marketing and selling 10GBASE-T ready cabling since 2004 and now that 10GBASE-T equipment and power consumption is becoming more economical, the time has come for customers to take full advantage of their category 6A and higher cabling investment.

Among the event highlights were Aquantia’s Sean Lundy and Intel’s Carl Hansen and Dave Chalupsky providing insight on how chip innovations from their respective companies were expected to significantly drive down 10GBASE-T power requirements for more energy-efficient 10GbE networks. According to Lundy, “The current 40nm generation can already achieve power of a couple of watts for connectivity within the rack in data centres and will trend to 1 watt or less with energy efficient ethernet and migration to finer geometries. We have now achieved a power, area, density envelope that has enabled dual-port LAN on Motherboard (LOM). Between LOM and 48-port high density switching, in 2011, we will see the beginning of the hockey stick growth curve for 10GBASE-T”.
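A back-of-the-envelope sketch of what that per-port trend means at the switch level follows; the 48-port count and the exact 2 W / 1 W figures are illustrative assumptions drawn loosely from the quotes, not vendor specifications:

```python
# Illustrative per-switch PHY power under the quoted per-port trend.
# 48 ports matches the "48-port high density switching" mentioned above;
# the wattages are rough readings of "a couple of watts" and "1 watt or less".
PORTS_PER_SWITCH = 48

def switch_phy_power(watts_per_port):
    """Total 10GBASE-T PHY power for a fully populated switch, in watts."""
    return PORTS_PER_SWITCH * watts_per_port

current_gen = switch_phy_power(2.0)   # current 40nm generation
future_gen = switch_phy_power(1.0)    # with EEE and finer geometries
print(current_gen, future_gen, current_gen - future_gen)
```

Halving the per-port figure saves on the order of 50 W per fully populated 48-port switch, which compounds quickly across the hundreds of switches in a large data centre.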

Regarding widespread commercial availability of 10GBASE-T equipment, Cisco’s Sudeep Goswami stated that Cisco is serious about 10GBASE-T and projected that the company’s flagship Nexus product family would join its Catalyst line in supporting 10GBASE-T in 2011.

(Reference - http://www.thedatachain.com)

How Do Health Information Websites Score on a 100-Point Customer Satisfaction Scale?


Private-sector health information websites scored a 79 on a 100-point customer satisfaction scale, while health insurance websites scored a 51, according to a study by ForeSee, a customer experience analytics firm.


Public-sector health information websites scored a 78 on the scale, according to the study. The study defined public-sector health information websites as those maintained by the federal government and not-for-profit organizations. Meanwhile, hospital and health system websites scored a 78.


Two sub-categories of private-sector health information websites -- sites that included information about pharmaceuticals and health products -- both scored a 76.


The study also found that health information website visitors who give a satisfaction rating of 80 or higher say they are 127% more likely to use the site as their main resource for interacting with a health care organization.


Results are based on an analysis of 100,000 surveys conducted from August to September 2011.


Source: ForeSee, "The 2011 ForeSee Healthcare Benchmark"



(Source - http://www.ihealthbeat.org/data-points/2011/how-do-health-information-websites-score-on-a-100-point-customer-satisfaction-scale.aspx)

Friday, November 18, 2011

The toilet, re-imagined: four water-saving designs.

Saturday is World Toilet Day, an annual awareness-raising campaign sponsored by the World Toilet Organization and aimed at improving sanitation access for the 2.6 billion people around the world who lack it.

A number of NGOs are working to develop low-cost, low-power systems to address the public health dangers and environmental degradation that comes from poor sanitation in the developing world.

But there’s also a fair amount of innovation underway to improve the design and efficiency of the conventional flush toilet. Herewith, a quick survey of some of these re-imagined toilets.

More than dual flush

Old-school toilets can use as much as five gallons — five gallons! — with each flush. To reduce that obvious waste of a precious resource, a number of manufacturers are offering dual-flush toilets. One lever is pushed for a “number one” and the other lever — which sends markedly more water into the bowl — is used to flush poop.
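The scale of the savings is easy to sketch. Assuming (illustratively) a household of four, five flushes per person per day, 5.0 gallons per flush for an old-school toilet, and a dual-flush using 1.6 gallons for a full flush and 0.8 gallons for a half flush, with one flush in four being a full one:

```python
# Illustrative annual water-use comparison; all figures are assumptions,
# not from the article.
FLUSHES_PER_DAY = 4 * 5   # household of four, five flushes each per day

def annual_gallons(gal_per_flush):
    """Annual household water use at a fixed per-flush volume."""
    return FLUSHES_PER_DAY * gal_per_flush * 365

old_school = annual_gallons(5.0)           # 36,500 gallons per year
# Dual flush: 5 full flushes (1.6 gal) + 15 half flushes (0.8 gal) per day.
dual_flush = 365 * (5 * 1.6 + 15 * 0.8)    # 7,300 gallons per year
print(int(old_school), int(dual_flush))
```

Under these assumptions the dual-flush household uses about a fifth of the water, which is before any grey-water reuse of the kind the designs below add.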

But Caroma, an Australian manufacturer of commercial and residential bathroom products, has one-upped the dual flush toilet with its Profile Smart 305.

Yes, that is a sink you’re seeing, integrated into the toilet. Here’s the way it works. After you do your business, you use the sink to wash your hands. The sink uses fresh water, but that water is then stored in the tank as grey water. And then when the toilet is flushed, it uses the grey water instead of more fresh water.

The unit also includes a dual-flush, so it is already designed for efficiency.

But the integrated sink boosts that efficiency further by eliminating the need to pump fresh water into the bowl.

And think about that for a minute: fresh, clean water, straight from a water treatment facility, is pumped into the billions of flush toilets around the world. There’s no reason for that, and it not only wastes the water, it wastes the considerable energy that went into cleaning and delivering that water to the building in the first place.

But the ergonomics on this model are a bit funky. It seems like it would be easier to use if the sink were positioned perpendicular to the bowl…and maybe a tad bigger.

And that’s just what the Spanish company Roca did with its W+W (for wash basin and water closet) design. As with the Caroma, the water that goes down the sink’s drain is collected, filtered, and then used to flush the toilet. But with the sink facing out from the toilet seat, it has a more separated look and, I would imagine, feel. In fact, it wouldn’t feel odd to use the Roca basin for, say, tooth-brushing.

The Roca model also uses an energy-saving faucet design. The handle always opens the faucet into a cold water position, so that the user can’t inadvertently create demand for hot water unless he or she really wants to.

The DIY approach

But replacing a perfectly good toilet with a new one like the Caroma or Roca is a bit wasteful in its own right. Fortunately, there are ways to recycle grey water for your sanitary needs through retrofits. Kentucky-based WaterSaver Technologies sells a solution called AQUS, which collects water from your bathroom sink and pipes it into the water reservoir of your toilet. The company says the sink doesn’t need to be located right next to the toilet for the system to work, because it can pipe the water across the room and into the toilet.

There are a couple of downsides to this approach, though. The holding tank eats up storage space under the sink, and the system needs power to operate, so you’ll need a spare outlet.

If space is as great a concern as water savings, there’s this Yanko Design solution that combines not only the sink and toilet into a single unit, but tosses in a mirror, a mini table and an…espresso cup holder? That’s what this design appears to offer:

In any case, the centuries-old flush toilet design is getting an overhaul, with an eye toward water and energy conservation. And that’s a good thing.

Photos: Flickr/Britt Selvitelle, Caroma, Roca, WaterSaver, Yanko Design

(Reference - http://www.smartplanet.com)

Sunday, November 13, 2011

PFI Frequently Asked Questions

PPP

Question 1: In the Malaysian context, what is PPP?

Answer: The Private Finance Initiative (PFI), or Public-Private Partnership (PPP), announced in the Ninth Malaysia Plan (RMke-9), is another method of procuring Government projects. It refers to the transfer to the private sector of responsibility for financing and managing capital investment and services, such as the construction, management, maintenance, repair and replacement of Government assets like buildings, infrastructure, equipment and other facilities, in a way that creates a stand-alone business. In return, the public sector pays for the services provided by the private sector. In some PPP projects, users pay the service provider directly. The terms PFI and PPP are used interchangeably, but in the Malaysian context PFI is a subset of PPP.

Question 2: How is payment made to the private sector under the PPP method?

Answer: Payment is based on the company's service-delivery performance: the Government pays only if the service delivered meets the pre-agreed service level or the Key Performance Indicators (KPIs) that have been set.

Question 3: What should I do if I want to submit a project proposal for consideration?

Answer: Any project proposal should be submitted directly to the relevant ministry or agency.

Question 4: What are the criteria for companies that want to take part in PPP projects?

Answer: Among the criteria required, the lead company must have a sound financial position as well as the relevant management and technical expertise. In addition, the SPV company involved must have paid-up capital of at least 10% of the project value.

Question 5: Why did the Government recently establish another PPP Unit under the Prime Minister's Department? Does this new Unit serve any particular purpose?

Answer: The establishment of this Unit demonstrates the Government's commitment to improving the delivery system, in terms of the processes and procedures needed for the successful implementation of PPP projects. With this Unit under the Prime Minister's Department, the control and legal aspects of PPP projects will receive greater attention to ensure their successful implementation. It also shows that the Government is serious about helping businesses by raising the efficiency and effectiveness of the public delivery system, and that it appreciates the private sector's growing role in the national economy.

Question 6: What does Value for Money (VFM) mean in procurement via the PPP method?

Answer: VFM refers to the optimal combination of whole-life-cycle cost and quality in meeting customer needs, rather than selecting a service on the basis of the lowest cost.

Question 7: How is Value for Money (VFM) achieved through PPP?

Answer: VFM is achieved through optimal risk allocation between the public and private sectors, long-term contracts that cover life-cycle costing, output specifications that encourage innovation, competitive bidding, performance-based payment mechanisms, and the skills and expertise of the private sector involved.

Question 8: What is meant by optimal risk allocation in procurement via the PPP method?

Answer: Optimal risk allocation means that each project risk is allocated to the party best able to manage it. For example, design and construction risk is allocated to the private sector, and the public sector pays only once the service is available.

Question 9: Are information and communications technology (ICT) projects suitable for implementation via the PPP method?

Answer: ICT projects are not suitable for the PPP method because the technology involved changes frequently and over short time frames.

Question 10: What are the differences between conventional procurement, PPP and privatization?

Answer: The differences between the three procurement methods are shown in the following table:

Financing:
- Conventional: Procurement is financed directly from the Government budget.
- PPP: Financing comes from private-sector funding sources, without Government guarantee.
- Privatization: Financing comes from private-sector funding sources, without Government guarantee.

Budget impact:
- Conventional: Direct impact on the public sector's financial position.
- PPP: Impact on the public budget is spread over the concession period.
- Privatization: No impact on public-sector expenditure.

Risk:
- Conventional: Risk is borne entirely by the public sector.
- PPP: Risk is allocated to the party best able to manage it.
- Privatization: Risk is borne entirely by the private sector.

Public-sector involvement:
- Conventional: Extensive public-sector involvement at every stage throughout the project's life.
- PPP: Private-sector involvement governed by enforcement of the agreed KPIs.
- Privatization: The Government acts as the regulatory body.

Contract term:
- Conventional: Short-term contracts with the private sector.
- PPP: Long-term contracts with the private sector.
- Privatization: Long-term contracts with the private sector.

Suitability:
- Conventional: Suitable for projects with high socio-economic returns and strategic considerations.
- PPP: Suitable for commercially viable projects.
- Privatization: Suitable for highly commercially viable projects.