Tuesday, February 20, 2018

Data Center/AI Stories You Might Have Missed Last Year



The rapid progress of artificial intelligence (AI) is impacting the global data center industry in multiple ways. Colocation service providers are looking at ways to use AI for energy efficiency, server optimization, security, automation, and infrastructure management. As an owner of data centers in New York, Los Angeles, Paris, and other prominent global locations, Telehouse is keenly interested in the advancement of AI in the data center space. Here are some stories that captured our attention last year and that we think will have a far-reaching impact.

Data Centers Get AI Hardware Upgrade from Big Hardware Manufacturers

The hardware market for AI-based applications is heating up. Intel, AMD, Microsoft, Google, ARM, and NVIDIA have all announced specialized hardware targeted at artificial intelligence. Intel unveiled its Nervana Neural Network Processor (NNP) family of chips, designed specifically for AI applications in data centers. AMD's EPYC processor, with 32 "Zen" cores, 8 memory channels, and 128 lanes of high-bandwidth I/O, is likewise aimed at high-performance computing. Microsoft is experimenting with Altera FPGA chips in its Azure cloud to handle more AI processing.

Google's announcement of the Tensor Processing Unit (TPU) on the Google Cloud Platform probably received the most press. The TPU is optimized for TensorFlow, the open-source machine learning framework. NVIDIA's graphics cards are already in big demand for machine learning applications, but the company has also unveiled the Volta GPU architecture for its data center customers.

ARM processors are generally known for their use in low-power mobile devices, but the company is taking a stab at the data center AI market with two new offerings: the Cortex-A75 and the Cortex-A55.

With the big names in the hardware industry fighting for dominance, global data centers will have a plethora of hardware choices for AI applications.

Personal Assistants Are Driving the Demand for AI Processing

Amazon Alexa, Google Assistant, Apple Siri, and Microsoft Cortana are competing with each other to win the next generation of users. As more people use voice queries and personal assistants, the dynamics of internet search are changing. The change is significant enough to threaten Google's dominance: if future users move to voice for everyday searches, Google will have to rethink its advertising strategy. The winner of the personal assistant battle could end up owning the future of e-commerce.

Artificial intelligence is the backbone of personal assistant technology. According to a Consumer Intelligence Research Partners (CIRP) survey, Amazon has sold more than 10 million Alexa devices since 2014. Because the personal assistant market is so lucrative, innovative startups will try to disrupt the space, and these newcomers will require massive data centers to handle their AI processing needs. As related devices and applications proliferate, the need for global data centers with AI capabilities will grow with them.

Big Basin and Facebook

Facebook's do-it-yourself (DIY) approach to AI hardware might become the model for colocation service providers. Facebook uses artificial intelligence for speech, photo, and video recognition, as well as for feed updates and text translations, so the company needs hardware that can keep up with its growing AI requirements.

The Big Sur GPU server was Facebook's first-generation AI-specific custom hardware: a 4U chassis with eight NVIDIA M40 GPUs, two CPUs, and SSD storage. Facebook took what it learned from that configuration and used it to build the next-generation Big Basin architecture, which incorporates eight NVIDIA Tesla P100 GPU accelerators and improves on the Big Sur design. The added hardware and more modular design have given Big Basin a performance boost: instead of the 7 teraflops of single-precision floating-point arithmetic per GPU in Big Sur, the new architecture delivers 10.6 teraflops per GPU.
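As a rough sanity check on those figures, the short script below multiplies the per-GPU numbers cited above by the eight GPUs in each chassis to compare aggregate single-precision throughput. It is back-of-the-envelope arithmetic only, not a published Facebook benchmark.

```python
# Rough comparison of aggregate single-precision (FP32) throughput for the
# two designs, using the per-GPU figures cited above.

GPUS_PER_SYSTEM = 8

per_gpu_tflops = {
    "Big Sur (NVIDIA M40)": 7.0,
    "Big Basin (Tesla P100)": 10.6,
}

for name, tflops in per_gpu_tflops.items():
    total = tflops * GPUS_PER_SYSTEM
    print(f"{name}: {tflops} TFLOPS/GPU x {GPUS_PER_SYSTEM} GPUs = {total:.1f} TFLOPS")

# Relative per-GPU improvement of Big Basin over Big Sur
print(f"per-GPU gain: {(10.6 - 7.0) / 7.0:.0%}")
```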

Contact Details:
Telehouse America
7 Teleport Drive,
Staten Island,
New York, USA 10311
Phone No: 718-355-2500
Email: gregory.grant@telehouse.com

Thursday, February 15, 2018

Telehouse Introduces Data Center Robotics



The term "robot" derives from the Czech word robota, meaning "hard work" or "drudgery." With advances in technology, data center colocation services now include robotics in specialized applications that reduce human labor. Data center colocation providers deploy robotics primarily to enhance efficiency: facing fierce competition, businesses continually search for ways to make their infrastructures less expensive and more agile. Robotics also reduces the need for on-site IT staff while delivering greater monitoring accuracy and improved security.

Both EMC and IBM currently rely on iRobot Create units that traverse data center colocation facilities to check for fluctuations in temperature, humidity, and system vibration. As the robot scours a site for the sources of vulnerabilities, such as cooling leaks, it sends the data it gathers over a Wi-Fi connection for processing. An algorithm then converts the data into a thermal map so that managers can identify anomalies.
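The exact algorithm EMC and IBM use is not public, but the general idea is easy to illustrate. The sketch below, with made-up readings and a hypothetical alert threshold, buckets the robot's scattered temperature samples into grid cells and flags cells that run hot, which is the essence of a thermal map.

```python
# Minimal sketch of the thermal-mapping idea described above (not EMC's or
# IBM's actual algorithm): bucket scattered temperature readings collected by
# a roving robot into grid cells and flag cells above an alert threshold.

from collections import defaultdict
from statistics import mean

# (x_metres, y_metres, temperature_C) samples along the robot's path -- made up
readings = [
    (1.2, 0.8, 22.5), (3.4, 0.9, 23.1), (5.1, 1.1, 29.8),
    (1.0, 2.9, 22.8), (3.3, 3.0, 23.0), (5.2, 3.1, 30.4),
]

CELL_SIZE_M = 2.0         # grid resolution of the thermal map
ALERT_THRESHOLD_C = 27.0  # hypothetical hot-spot threshold

# Average all readings that fall into the same grid cell
cells = defaultdict(list)
for x, y, temp in readings:
    cells[(int(x // CELL_SIZE_M), int(y // CELL_SIZE_M))].append(temp)
thermal_map = {cell: mean(temps) for cell, temps in cells.items()}

for cell, temp in sorted(thermal_map.items()):
    status = "possible hot spot" if temp > ALERT_THRESHOLD_C else "ok"
    print(f"cell {cell}: {temp:.1f} C  ({status})")
```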

PayPerHost is working on Robonodes, still in the concept phase, which would replace a failed customer server or storage node. Sony and Facebook rely on robotic units as part of their Blu-ray disc-based media storage archives. Overall, robotics helps businesses shrink the footprint of data center managed services while simplifying infrastructure.

Telehouse is responding to the increased demand for cloud computing and to these technological advances. In time, data center resilience and archiving efficiency will improve further thanks to more robust systems, automation software, and careful planning.

Thursday, January 18, 2018

Algorithms: Smart Yet Slightly Frightening



In smart cities, as well as in data center and colocation facilities, algorithms play a critical role. Without algorithms there would be no computer operating systems and, therefore, no World Wide Web and no Google. For a colocation provider, algorithms make it possible to offer customers a safe and reliable service.

Algorithms also help transform Big Data, first converting it into analytics and then into action. Colocation service providers sit at the heart of smart cities, with algorithms assisting Data Center Infrastructure Management (DCIM) tools in predicting cooling problems.

Load balancing algorithms are critical for colocation services, distributing application or network traffic across servers and thereby making them more efficient. There are also smart storage algorithms that process rich media requests, including video, and can cut energy consumption for enterprise-level storage area networks by as much as 50 percent.
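To make the load-balancing point concrete, here is a minimal Python sketch of two textbook strategies a balancer might use; the server names and connection counts are invented for illustration, and production load balancers in a colocation facility are of course far more sophisticated.

```python
# Minimal sketch of two common load-balancing strategies: round-robin and
# least-connections. Server names and connection counts are hypothetical.

from itertools import cycle

servers = ["web-01", "web-02", "web-03"]

# Round-robin: hand each incoming request to the next server in turn.
next_server = cycle(servers)
for request_id in range(5):
    print(f"request {request_id} -> {next(next_server)}")

# Least-connections: send the next request to the server currently
# handling the fewest active connections.
active_connections = {"web-01": 12, "web-02": 4, "web-03": 9}
least_loaded = min(active_connections, key=active_connections.get)
print(f"next request -> {least_loaded}")
```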

Algorithms affect both our personal and professional lives in ways we can scarcely imagine, which is exciting yet somewhat unnerving. As more businesses adopt and enhance digital solutions, there is a strong chance that more colocation service providers will rely on algorithms for storage, computing, and networking.

For organizations with business-critical data, Telehouse provides superior colocation services across 48 data centers worldwide. Business owners gain peace of mind thanks to high security, redundant power, and flawless interconnection to hundreds of service providers.


Friday, September 22, 2017

Backup Disaster Recovery Solutions for Smart Cities


Roughly 2.3 billion connected devices exist in smart cities like Tokyo, London, and New York, a whopping 42 percent increase from last year. Disaster recovery solutions are therefore critical: to keep bolstering economic growth and enhancing quality of life, companies must have a viable IT disaster recovery plan in place.

The Internet of Things (IoT) reduces energy consumption for tasks like water flow regulation and street light operation. However, top disaster recovery service providers predict this footprint will exceed 50 billion connected devices by 2020, creating a tremendous challenge. Smart cities therefore need an IT disaster recovery plan to protect against natural and man-made incidents that could cut them off from the accurate data they need to function optimally.

In addition to the risk of grid outages, smart cities are targets for cyber-attacks. Cloud disaster recovery becomes vital to prevent multiple city services from being shut down through a single entry point, an all-too-real possibility that would threaten public health and safety.

Along with 162,000 square feet of colocation space at a dedicated Continuity Recovery site, Telehouse New York Teleport boasts ample offices to accommodate personnel should an adverse event occur.

Friday, September 1, 2017

ALGORITHMS: SCARY SMART AND SOMETIMES JUST PLAIN SCARY

Algorithms, step-by-step procedures designed to solve a problem or perform a task, automate much of the technology that makes smart cities smart, from intelligent public transportation and traffic management to smart electrical grids and water usage. Algorithms are also a fundamental tool in transforming Big Data, first into useful analytics and eventually into action. More on that later.


Data centers and colocation facilities, the pillars of smart cities, are replete with examples of the use of algorithms. Data Center Infrastructure Management (DCIM) tools predict cooling issues based on algorithms built from temperature pattern models. There are load balancing algorithms, which play an important role in distributing network or application traffic across servers, thereby dynamically improving the efficiency of computing resources. And there are smart storage algorithms, which process requests for video and other rich media, and hold the promise of reducing energy use for enterprise-level storage area networks by 20 to 50 percent.
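To give a flavor of how a DCIM tool might predict a cooling issue from temperature patterns, here is a deliberately simple sketch: it fits a straight-line trend to a few invented rack-inlet readings and estimates when a hypothetical alarm threshold would be crossed. It illustrates the principle only; commercial DCIM tools use far richer models.

```python
# Minimal sketch of the idea behind DCIM cooling prediction: fit a linear
# trend to recent rack-inlet temperature samples and estimate when a
# hypothetical alarm threshold would be crossed. Real DCIM tools use much
# richer temperature-pattern models; these readings are made up.

samples = [(0, 24.1), (5, 24.4), (10, 24.9), (15, 25.3), (20, 25.8)]  # (minutes, deg C)
ALARM_C = 27.0

# Ordinary least-squares slope and intercept over the samples
n = len(samples)
sum_t = sum(t for t, _ in samples)
sum_c = sum(c for _, c in samples)
sum_tt = sum(t * t for t, _ in samples)
sum_tc = sum(t * c for t, c in samples)
slope = (n * sum_tc - sum_t * sum_c) / (n * sum_tt - sum_t ** 2)
intercept = (sum_c - slope * sum_t) / n

if slope > 0:
    minutes_to_alarm = (ALARM_C - intercept) / slope
    print(f"warming at {slope:.3f} C/min; ~{minutes_to_alarm:.0f} min until the {ALARM_C} C alarm")
else:
    print("no warming trend detected")
```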

The world's first recorded algorithm, the eponymously named Euclidean algorithm for finding the greatest common divisor of two numbers, came to us around 300 B.C. and is still used by computers today. In fact, without algorithms there would be no computer operating systems, no World Wide Web, and no Google with which to Google "algorithms," much less the name of that actress who starred in that movie with that guy.

Okay, so now we have your attention.

Getting Too Personal

Today, algorithms are increasingly affecting our personal and professional lives in ways that we can’t imagine or might even find unsettling. Consider the algorithm created by the analytics team at the U.S. retailer Target, which could calculate whether a woman is pregnant and even when she is due to give birth.

In a nutshell, Target, like every retailer, stores a history of every item its customers have bought, along with any demographic information the company has collected from them. Target analyzed this information against historical buying data for all the women who had ever signed up for its baby registries. The analytics team then created an algorithm that identified 25 products — from unscented lotion to supplements such as calcium and zinc to oversized purses large enough to double as a diaper bag — which, when analyzed collectively, assigned each shopper a "pregnancy prediction" score. More importantly for direct marketing purposes, the algorithm also estimated each woman's due date, so that Target could send coupons to customers' homes timed to specific stages of pregnancy.
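Target has never published the model itself, but the basic mechanics of a purchase-based score are straightforward. The toy sketch below, with invented products and weights, shows how individually innocuous items can be combined into a single prediction score.

```python
# Toy illustration of the weighted-scoring idea described above. Target's
# actual 25-product model and its weights were never published; the products
# and weights here are invented purely to show the mechanics.

SIGNAL_WEIGHTS = {
    "unscented lotion": 0.15,
    "calcium supplement": 0.10,
    "zinc supplement": 0.10,
    "oversized purse": 0.08,
}

def prediction_score(purchase_history):
    """Sum the weights of any signal products found in a shopper's history."""
    return sum(SIGNAL_WEIGHTS.get(item, 0.0) for item in purchase_history)

shopper = ["unscented lotion", "zinc supplement", "oversized purse", "batteries"]
print(f"pregnancy prediction score: {prediction_score(shopper):.2f}")  # 0.33 here
```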

And what could be the harm in that? Pregnancy, birth, an impending bundle of joy? Well, some women, families, and especially teenagers preferred that their pregnancies remain private. But Target's predictive, algorithm-based marketing hadn't factored that very human element into the campaign, and trouble ensued.

Every Digital Breath You Take

And then, of course, there is the U.S. National Security Agency's XKeyscore program, one of the covert projects revealed by Edward Snowden. You may never have heard of XKeyscore, but it has definitely heard of you.

XKeyscore collects every digital breath you've ever taken on the Internet, including browsing history, Google searches, and the content of your emails and online chats, and, at the tap of a keyboard, can process that data through an algorithm to identify potentially subversive activity. The NSA's own training materials identified XKeyscore as its "widest-reaching" system for developing intelligence from the Internet.


Thursday, August 31, 2017

FRAILTY, THY NAME IS DATA! – MASTERING DISASTER RECOVERY


Mastering Disaster

From a busted pipe to a fire, disasters leave your data vulnerable. This is a serious problem, considering that half of businesses that lose data for 10 days or more end up filing for bankruptcy within six months. You can avoid this fate by having a data center disaster recovery plan in place. The right disaster recovery strategy will safeguard your data and get you back up and running quickly.

Run Mock Drills

Your data center disaster recovery strategy must include drills. Mock drills prepare your team for a disaster so they can handle the stress of the real thing, which prevents panic and ensures an effective response. The more drills you run, the better off you will be: practice turns into habit, and mock drills build the right habits.

Use Advance Planning and a Clear Chain of Communication

Advance planning and communication are also essential to data center recovery. Start with a clear chain of communication so people know exactly whom to contact during a disaster, run mock drills so everyone can practice communicating, and update the chain whenever staff changes or new technology arrives.

As for planning, consider what could fail during a disaster and keep spare parts on hand so problems can be fixed quickly. In an emergency, your vendors won't be able to rush out to you, so having those parts available is essential to getting back up and running. In addition, keep up with your maintenance schedule to reduce the chance of failure during a disaster.


Monday, August 28, 2017

WREAKING HAVOC: DDOS ATTACKS ARE GROWING MORE FREQUENT, SOPHISTICATED AND COSTLY


Distributed Denial-of-Service (DDoS) attacks, which do not discriminate, have been used to target financial services, healthcare, technology, media and entertainment, software, gaming and a host of other industry sectors. According to Neustar’s Worldwide DDoS Attacks and Protection Report, 73 percent of organizations have suffered a DDoS attack, and 85 percent of attacked businesses have been victims of multiple assaults. Almost half of all DDoS targets run the risk of losing more than $100,000 per hour, with one-third exposed to potential losses of more than $250,000 per hour.

How do those numbers add up over the duration of an attack? According to research by the Ponemon Institute, the average cost of a DDoS attack last year was $4.7 million. Moreover, more than half of all targets have also suffered a cybersecurity breach while undergoing a DDoS attack.
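A quick bit of arithmetic shows how those hourly figures compound over the life of an attack. The durations below are illustrative; the per-hour losses and the $4.7 million average come from the figures cited above.

```python
# How the per-hour exposure cited above compounds over an attack's duration.
# The durations are illustrative; the per-hour losses come from the report.

hourly_losses = {"at-risk target": 100_000, "high-exposure target": 250_000}  # USD/hour

for label, hourly in hourly_losses.items():
    for hours in (6, 24, 48):
        print(f"{label}: {hours:>2} h x ${hourly:,}/h = ${hours * hourly:,}")

# At $250,000/hour, reaching the $4.7M average cost reported by Ponemon
# would take roughly this many hours of downtime:
print(f"~{4_700_000 / 250_000:.0f} hours")
```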


While the number and severity of DDoS attacks have risen every year, the past year has seen the rise of mega attacks targeting major sites. Twelve attacks exceeding 100 Gbps of throughput have been recorded, five of which exceeded 200 Gbps.

The most notorious and largest of these recent attacks were executed in the fall of 2016 by the Mirai botnet, a collection of more than 100,000 compromised Internet-connected video cameras, consumer routers and other devices. Mirai's victims included the Republic of Liberia's Internet infrastructure, as well as the U.S. DNS service provider Dyn, whose outage made several high-profile websites such as Netflix, Spotify and Airbnb unreachable. When one can't binge-watch a favorite show, listen to R&B classics, or book an affordable room in Paris or Barcelona, something has to change. More on that later.

Cybercriminals have many DDoS-enabled weapons in their quiver. While the Mirai attack directly targeted Dyn’s DNS service, DNS services can also be interrupted by a method known as spoofing, or cache poisoning, whereby corrupt DNS data is used to divert Internet traffic away from the correct server. In fact, one in five DDoS attacks last year were DNS-based.

DNS servers can also be used to generate DDoS traffic through DNS amplification. The cybercriminal sends a DNS query with a forged source IP address, and the DNS server sends its response to that forged address, which belongs to the target of the attack. The result is that small queries trigger large responses that can overwhelm the target.
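The arithmetic that makes amplification attractive to attackers is simple, as the rough sketch below shows; the query and response sizes are ballpark figures for illustration rather than measurements of any specific attack.

```python
# Rough arithmetic behind DNS amplification: a small spoofed query elicits a
# much larger response aimed at the victim. The byte sizes are ballpark
# figures for illustration, not measurements of any particular attack.

query_bytes = 60        # small DNS query sent with a forged source IP
response_bytes = 3000   # large response (e.g. an "ANY" answer with many records)

amplification = response_bytes / query_bytes
print(f"amplification factor: ~{amplification:.0f}x")

# 100 Mbit/s of attacker queries becomes, at the victim:
attacker_mbps = 100
print(f"{attacker_mbps} Mbit/s of queries -> ~{attacker_mbps * amplification / 1000:.0f} Gbit/s at the target")
```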

So, how do companies solve the challenge of quickly detecting and mitigating the pernicious threat of DDoS attacks?
