Tuesday, February 20, 2018

Data Center/AI Stories You Might Have Missed Last Year



The rapid progress of artificial intelligence (AI) is impacting the global data center industry in multiple ways. Colocation service providers are looking at ways to use AI for energy efficiency, server optimization, security, automation, and infrastructure management. As an owner of data centers in New York, Los Angeles, Paris, and other prominent global locations, Telehouse is interested in the advancement of AI in the global data center space. Here are some stories that captured our attention last year. We think these stories will have far-reaching impact.

Data Centers Get AI Hardware Upgrade from Big Hardware Manufacturers

The hardware market for AI-based applications is heating up. Intel, AMD, Microsoft, Google, ARM, and NVIDIA have announced their own specialized hardware targeted at artificial intelligence. Intel unveiled its Nervana Neural Network Processor (NNP) family of chips specifically designed for AI applications in data centers. AMD's EPYC processor with 32 "Zen" cores, 8 memory channels, and 128 lanes of high-bandwidth I/O is also designed for high-performance computing. Microsoft is experimenting with Altera FPGA chips on its Azure cloud to handle more AI processing.

Google's announcement of the Tensor Processing Unit (TPU) on the Google Cloud Platform probably received the most press. The TPU is optimized for TensorFlow, Google's open-source machine learning framework. NVIDIA's graphics cards are already in big demand for machine learning applications, and it has now unveiled the Volta GPU architecture for its data center customers.

ARM processors are generally known for their use in low-power mobile devices, but the company is taking a stab at the data center AI market with two new offerings: the Cortex-A75 and Cortex-A55.

With the big names in the hardware industry fighting for dominance, global data centers will have a plethora of hardware choices for AI applications.

Personal Assistants Are Driving the Demand for AI Processing

Amazon Alexa, Google Assistant, Apple Siri, and Microsoft Cortana are competing with each other to win the next generation of users. As more people start using voice queries and personal assistants, the dynamics of internet search are changing. The change is significant enough to threaten Google's dominance: if future users move to voice for daily searches, Google will have to rethink its advertising strategy. The winner of the personal assistant battle could end up owning the future of e-commerce.

Artificial intelligence is the backbone of personal assistant technology. According to a Consumer Intelligence Research Partners (CIRP) survey, Amazon has sold more than 10 million Alexa devices since 2014. Because the personal assistant market is lucrative, innovative startups will try to disrupt the space. And these newcomers will require massive data centers to handle their AI processing needs. As related devices and applications proliferate, the need for global data centers with AI capabilities will also increase.

Big Basin and Facebook

Facebook's do-it-yourself (DIY) approach to AI hardware might become the model for colocation service providers. Facebook uses artificial intelligence for speech, photo, and video recognition. It also uses AI for feed updates and text translations. So it needs hardware that can keep up with its increasing AI requirements.

The Big Sur GPU server was Facebook's first-generation AI-specific custom hardware: a 4U chassis with eight NVIDIA M40 GPUs, two CPUs, and SSD storage. Facebook learned from its experimentation with this hardware configuration and used that learning to build the next-generation Big Basin architecture, which incorporates eight NVIDIA Tesla P100 GPU accelerators and improves on the Big Sur design. The added hardware and more modular design have given Big Basin a performance boost. Instead of 7 teraflops of single-precision floating-point arithmetic per GPU in Big Sur, the new architecture gets 10.6 teraflops per GPU.
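To put those per-GPU figures in perspective, here is a quick back-of-the-envelope comparison of per-server throughput. The GPU counts and teraflop numbers come from the figures cited above; the aggregate math is purely illustrative, since real training workloads rarely scale perfectly across all eight GPUs.

```python
# Rough single-precision throughput comparison between Facebook's
# Big Sur (8x NVIDIA M40) and Big Basin (8x Tesla P100) server designs.
GPUS_PER_SERVER = 8

big_sur_tflops_per_gpu = 7.0     # single-precision, per the article
big_basin_tflops_per_gpu = 10.6  # single-precision, per the article

big_sur_total = GPUS_PER_SERVER * big_sur_tflops_per_gpu
big_basin_total = GPUS_PER_SERVER * big_basin_tflops_per_gpu
speedup = big_basin_tflops_per_gpu / big_sur_tflops_per_gpu

print(f"Big Sur:   {big_sur_total:.1f} TFLOPS per server")
print(f"Big Basin: {big_basin_total:.1f} TFLOPS per server")
print(f"Per-GPU improvement: {(speedup - 1) * 100:.0f}%")
```

On paper, that is roughly a 50 percent jump in peak per-GPU throughput between the two generations, before accounting for the interconnect and modularity improvements the article mentions.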

Contact Details:
Telehouse America
7 Teleport Drive,
Staten Island,
New York, USA 10311
Phone No: 718-355-2500
Email: gregory.grant@telehouse.com

Thursday, February 15, 2018

Telehouse Introduces Data Center Robotics



The term "robot" derives from the Czech robota, meaning "hard work" or "drudgery." With advances in technology, data center colocation services now include robotics as part of specialized applications that reduce human labor. Primarily, data center colocation providers deploy robotics to enhance efficiency. Facing fierce competition, businesses continually search for ways to make their infrastructures less expensive and more agile. Robotics reduce the need for on-site IT staff while enabling greater monitoring accuracy and improved security.

Both EMC and IBM currently rely on iRobot Create, which traverses data center colocation facilities to check for fluctuations in temperature, humidity, and system vibrations. After the robot scours a data center colocation site for sources of vulnerability, such as cooling leaks, it transmits the gathered data for processing over a Wi-Fi connection. An algorithm converts the data into a thermal map so that managers can identify anomalies.
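The thermal-mapping step can be sketched in a few lines. This is a minimal illustration of the general idea, not EMC's or IBM's actual algorithm: readings collected at grid positions on the data center floor are averaged, and any cell that runs sharply hotter than the mean is flagged for a manager to investigate. The readings, grid layout, and threshold below are all made up for the example.

```python
# Illustrative sensor readings: (row, col) floor-grid cell -> temperature (C).
readings = {
    (0, 0): 22.1, (0, 1): 22.4, (0, 2): 22.3,
    (1, 0): 22.0, (1, 1): 29.8, (1, 2): 22.5,  # hypothetical hot spot
    (2, 0): 21.9, (2, 1): 22.2, (2, 2): 22.4,
}

mean_temp = sum(readings.values()) / len(readings)
THRESHOLD = 3.0  # degrees C above the mean counts as an anomaly

# Flag cells running well above the facility-wide average.
anomalies = [cell for cell, temp in readings.items()
             if temp - mean_temp > THRESHOLD]

print(f"Mean temperature: {mean_temp:.1f} C")
print(f"Anomalous cells: {anomalies}")
```

A production system would work with far denser sensor grids and smarter baselines (per-aisle averages, historical trends), but the principle is the same: turn raw readings into a map, then surface the outliers.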

Still in the concept phase, PayPerHost is working on Robonodes, which would replace a failed customer server or storage node. Sony and Facebook rely on robotic units as part of Blu-ray disc-based media storage archives. Overall, robotics help businesses reduce the footprint of data center managed services while simplifying infrastructure.

Telehouse is responding to the increased demand for cloud computing and to these technological advances. Over time, data center resilience and archiving efficiency will improve through more robust systems, automation software, and careful planning.