Showing posts with label global data centers. Show all posts

Tuesday, May 1, 2018

Understanding The Role of Artificial Intelligence in The Data Center Industry

The amount of global data being stored, processed and managed continues to grow exponentially. In turn, artificial intelligence is playing a pivotal role in helping data center service providers capture, process and analyze this data faster and at greater scale than ever before. From automated monitoring systems to advanced energy savings, here’s how artificial intelligence is improving the operations and efficiency of global data centers.

What Artificial Intelligence Means for Data Center Service Providers

Artificial intelligence isn’t a new concept, and tools like face detection and voice recognition already play a major role in our daily lives. Strava, Inc. Staff Engineer Drew Robb adds that object identification, classification, and other forms of geographic and identity detection are leading AI uses in the enterprise market.



All of these applications place an increased strain on data centers because they require increased data storage and processing in order to run. Managing this immense increase in data requires that the data center industry scale, adapt, and acquire more computing power. Artificial intelligence enables the data center service provider to meet such demands in a variety of ways, including operational automation, elastic computing power and predictive maintenance.

Improving Data Center Efficiency

Increased data processing requires that data centers keep hardware cool. But with more data to process and hardware working harder, energy costs rise and the overall resource footprint of data centers grows.

Fortunately, machine learning is playing a vital role in helping companies understand their data center energy consumption. As explained in Datacenter Dynamics, artificial intelligence is being used to analyze temperature set points, evaluate cooling equipment and test flow rates. AI-powered smart sensors can ingest data from numerous sources and translate that information into environmental, electrical and mechanical insights. In addition to detecting sources of energy inefficiency, machine learning can be automated to make informed decisions that reduce data center energy consumption and cut costs.
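
To make the idea concrete, here is a minimal sketch of the kind of model described above: a regressor trained on sensor telemetry to predict cooling power draw, which an operator could then query with candidate set points. All feature names and figures below are invented for illustration; this is a toy, not any provider’s actual system.

```python
# Toy version of the machine-learning approach described above: predict
# cooling power draw from environmental sensor readings, so candidate
# set points can be compared before they are applied. All telemetry
# here is synthetic and purely illustrative.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.uniform(5, 35, n),     # outside air temperature (deg C)
    rng.uniform(200, 800, n),  # IT load (kW)
    rng.uniform(10, 50, n),    # chilled-water flow rate (L/s)
])
# Hypothetical ground truth: cooling power grows with heat load and
# outside temperature, plus sensor noise.
y = 0.3 * X[:, 1] + 4.0 * X[:, 0] + rng.normal(0, 10, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor().fit(X_train, y_train)
print(f"R^2 on held-out data: {model.score(X_test, y_test):.2f}")
```

A trained model like this can then be queried with candidate set points to find the configuration with the lowest predicted cooling power.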

Software solutions business manager Stefano D’Agostino adds that “innovative startups are using intelligent machines with self-learning algorithms to optimize the allocation of the IT load itself so that optimal cooling can be achieved.” The benefits of such technology are already being realized: statistics from the Data Center Science Center show that advancements in UPS efficiency and reduced cooling energy losses have helped ordinary data centers cut physical infrastructure costs by 80% over the last decade.

This shows that, even though artificial intelligence technology is partly responsible for an increase in data center processing, it can also be used to mitigate its own increases in energy consumption.

Strengthening Data Center Security

In addition to improving energy efficiency, AI can also improve the security of a data center. New York businesses rely on Telehouse because we’re committed to proactively managing customer data and reducing security risks wherever possible. We’re also tuned in to the latest advancements in AI security applications, which can screen and analyze data for security threats more thoroughly and rapidly. AI can also help distinguish normal from abnormal patterns, detect malware and spam, identify weak areas and strengthen protection against potential threats.
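
As a rough illustration of what “normal and abnormal patterns” means in practice, the sketch below runs an off-the-shelf isolation forest over synthetic per-session features; a production system would use far richer signals than these invented ones.

```python
# Unsupervised anomaly detection over (synthetic) session features:
# requests per minute, megabytes transferred, and failed logins.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)
normal_sessions = rng.normal([30, 5, 0.2], [10, 2, 0.5], size=(500, 3))
attack_sessions = rng.normal([400, 80, 15], [50, 10, 3], size=(5, 3))
sessions = np.vstack([normal_sessions, attack_sessions])

detector = IsolationForest(contamination=0.01, random_state=0).fit(sessions)
flags = detector.predict(sessions)  # -1 = anomalous, 1 = normal
print(f"Flagged {(flags == -1).sum()} of {len(sessions)} sessions for review")
```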

Detecting and Reducing Downtime

Another way that artificial intelligence can influence the modern data center service provider is through improved outage monitoring. AI-based monitors can predict outages before they occur and detect them the moment they do. They can also track server performance, disk utilization and network congestion.
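
One simple form of this kind of prediction is trend extrapolation: fit a line to recent disk-utilization samples and alert if the disk is projected to fill before it can be serviced. The data and thresholds below are invented for illustration.

```python
# Project when a disk will fill by fitting a linear trend to recent
# hourly utilization samples (synthetic data).
import numpy as np

hours = np.arange(48)  # the last 48 hourly samples
disk_pct = 60 + 0.5 * hours + np.random.default_rng(2).normal(0, 1, 48)

slope, intercept = np.polyfit(hours, disk_pct, 1)
if slope > 0:
    hours_to_full = (100 - disk_pct[-1]) / slope
    if hours_to_full < 72:  # illustrative three-day alert horizon
        print(f"ALERT: disk projected to fill in about {hours_to_full:.0f} hours")
```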

Today, artificial intelligence offers advanced predictive analytics services that make it easier and more reliable to monitor power levels and potential trouble areas.

Contact Details:
Telehouse America
7 Teleport Drive,
Staten Island,
New York, USA 10311
Phone No: 718-355-2500
Email: gregory.grant@telehouse.com

Tuesday, February 20, 2018

Data Center/ AI Stories You Might Have Missed Last Year



The rapid progress of artificial intelligence (AI) is impacting the global data center industry in multiple ways. Colocation service providers are looking at ways to use artificial intelligence for energy efficiency, server optimization, security, automation, and infrastructure management. As an owner of data centers in New York, Los Angeles, Paris, and other prominent global locations, Telehouse is interested in the advancement of AI in the global data center space. Here are some stories that captured our attention last year. We think these stories will have far-reaching impact.

Data Centers Get AI Hardware Upgrade from Big Hardware Manufacturers

The hardware market for AI-based applications is heating up. Intel, AMD, Microsoft, Google, ARM, and NVIDIA have all announced specialized hardware targeted at artificial intelligence. Intel unveiled its Nervana Neural Network Processor (NNP) family of chips specifically designed for AI applications in data centers. AMD’s EPYC processor, with 32 “Zen” cores, 8 memory channels, and 128 lanes of high-bandwidth I/O, is also designed for high-performance computing. Microsoft is experimenting with Altera FPGA chips on its Azure cloud to handle more AI processing.

Google’s announcement of the Tensor Processing Unit (TPU) on the Google Cloud Platform probably received the most press. The TPU is optimized for TensorFlow, Google’s open-source machine learning framework. NVIDIA’s graphics cards are already in big demand for machine learning applications, and it has now unveiled the Volta GPU architecture for its data center customers.

ARM processors are generally known for their use in low-power mobile devices, but ARM is taking a stab at the data center AI market with two new offerings: the Cortex-A75 and the Cortex-A55.

With the big names in the hardware industry fighting for dominance, global data centers will have a plethora of hardware choices for AI applications.

Personal Assistants Are Driving the Demand for AI Processing

Amazon Alexa, Google Assistant, Apple Siri and Microsoft Cortana are competing with each other to win the next generation of users. As more people use voice queries and personal assistants, the dynamics of internet search are changing. The change is significant enough to threaten Google’s dominance: if future users move to voice for daily searches, Google will have to rethink its advertising strategy. The winner of the personal assistant battle could end up owning the future of e-commerce.

Artificial intelligence is the backbone of personal assistant technology. According to a Consumer Intelligence Research Partners (CIRP) survey, Amazon has sold more than 10 million Alexa devices since 2014. Because the personal assistant market is lucrative, innovative startups will try to disrupt the space, and these newcomers will require massive data centers to handle their AI processing needs. As related devices and applications proliferate, so will the need for global data centers with AI capabilities.

Big Basin and Facebook

Facebook’s do-it-yourself (DIY) approach to AI hardware might become the model for colocation service providers. Facebook uses artificial intelligence for speech, photo, and video recognition, as well as for feed updates and text translations. So it needs hardware that can keep up with its ever-increasing AI requirements.

The Big Sur GPU server was Facebook’s first-generation, AI-specific custom hardware: a 4U chassis with eight NVIDIA M40 GPUs, two CPUs, and SSD storage. Facebook learned from its experimentation with this configuration and used those lessons to build the next-generation Big Basin architecture, which incorporates eight NVIDIA Tesla P100 GPU accelerators and improves on the Big Sur design. The added hardware and more modular design have given Big Basin a performance boost: instead of 7 teraflops of single-precision floating-point arithmetic per GPU in Big Sur, the new architecture delivers 10.6 teraflops per GPU.
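
A quick back-of-the-envelope check of those figures shows roughly a 1.5x jump in raw per-node throughput:

```python
# Aggregate single-precision throughput per eight-GPU node, using the
# per-GPU figures quoted above.
gpus = 8
big_sur = gpus * 7.0     # NVIDIA M40s:  56.0 TFLOPS per node
big_basin = gpus * 10.6  # Tesla P100s: 84.8 TFLOPS per node
print(f"Big Sur:   {big_sur:.1f} TFLOPS")
print(f"Big Basin: {big_basin:.1f} TFLOPS ({big_basin / big_sur:.2f}x)")
```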


Friday, July 21, 2017

GLOBAL SPOTLIGHT: INTERNET AND DATA CENTER GROWTH IN RUSSIA

Multinationals Seeking a Commercial Presence in Russia Push the Market Forward



With nearly 74 million users, Russia is Europe’s largest internet market. Given that figure, the size of Russia’s data center industry is surprisingly modest: the country’s commercial data center market reached just $303 million in 2014, though it has been growing at approximately 25 percent per year over the last five years, according to Direct INFO, a research consultancy.

In fact, as recently as eight years ago there were only half a dozen Tier I to Tier II commercial data centers in the entire country and these were largely operated by systems integrators. At the time, Russia’s technology talent pool lacked the necessary skillsets to build and operate modern data centers.

Today, however, Russia has no fewer than 180 data centers, most of which are in Moscow. Sixteen of the 20 largest data centers in the country operate in the capital, each containing more than 1,000 racks, with an average total capacity of 12 MW. Over the next several years, that number is anticipated to grow due to a confluence of factors, and not just in Moscow.

Government Regulations and Global Business Drive Growth

The data center colocation market, in particular, is being stimulated by government legislation, passed in September 2015, which forbids the storage of Russian citizens’ personal data on servers located abroad. Multinational and Russian financial institutions, as well as insurance and investment companies, are also facing new, more stringent regulations on international activity, which will increase the demand for premium data center services.

The other main drivers of the Russian colocation sector include a steady rise in demand for new white space, a growing interest among Russian enterprises in outsourced data center strategies, and an increasing number of international service providers and enterprises looking to establish a commercial presence in Russia.

With the development of enterprise branch networks, it also becomes desirable for companies to centralize the processing and storage of data in complex business applications, such as ERP and CRM systems. Hence, commercial data centers will increasingly be used to centralize the IT infrastructures of global companies. Moreover, the high reliability of commercial data centers will allow multinational firms to ensure the continuity of their business.

On the Edge and in the Cloud

The owners of large-scale web projects, including search engines, web portals and social networks that generate significant traffic and serve large numbers of users, also seek to locate their equipment closer to the end user, on the edge of the network, to reduce the costs of data transfer. These web-scale players are specifically interested in regional data centers.


THE “NEW” NEW YORK

How New York Is Evolving Into a World-Class Smart City


Simon Sylvester-Chaudhuri is a Managing Partner at Global Fortunes Group, spearheading various products and programs that drive urban innovation. As an advocate for technological innovation around the world, Simon is passionate about smart city development on a global scale and has worked with multiple world-class cities throughout Europe and the Middle East. We recently had the opportunity to interview Simon to discuss the current state of New York as a smart city as well as the policies that are driving the technological advancements that will define the “new” New York.

Coming to America

While the concept of smart cities has been in practice throughout Europe for nearly a decade, this trend has only taken hold in the United States over the past two to three years across a limited number of major metro areas, including New York, Chicago, Atlanta and San Francisco. In New York, the conversion into a technologically advanced smart city is predominantly driven by government programs and citizen engagement.

“One of the key drivers of innovation that I’ve experienced in New York is the willingness of policy-makers, privately-held enterprises and general citizens to work together to create a smarter and more advanced city,” shared Simon. “We’re not only focusing on the technology aspect, but also creating new ways to engage citizens and organizations with innovation labs and government programs. These provide an element of inclusiveness that is unique to New York, enabling intelligent discussion and action.”

As a testament to that commitment, the Mayor’s Office of Technology and Innovation (MOTI) has laid the groundwork for continued innovation by providing the necessary resources for a variety of projects, including the conversion of the historic Brooklyn Navy Yard into a model of modern, urban technological development. This industrial park also serves as home to the New Lab, one of the world’s leading technology hubs.

“In the push toward technological innovation, major universities such as Cornell, Columbia, NYU and CUNY are also getting involved in a big way,” Simon added. “Universities throughout this region are driving multiple initiatives to collect data that will help develop programs for enhanced urban and scientific progress as well as sustainability. One such program is the Urban Future Lab at the NYU Tandon School of Engineering, which hosts several programs focused on education, policy and market solutions to solve the challenge of sustainability in smart cities.”


Tuesday, May 16, 2017

TELEHOUSE GLOBAL SPOTLIGHT: OTT VIEWERSHIP IS FAST BECOMING OVER THE TOP

Colocation Provides a Solution to OTT Performance


Over-the-Top (OTT), in telecom parlance, refers to an app or service that delivers content such as streaming video and audio over the internet rather than traditional cable or satellite distribution. According to the 2017 OTT Video Services Study conducted by Level 3 Communications, viewership of OTT video services, including Netflix, Hulu and Amazon Prime, will overtake traditional broadcast TV within the next five years. Meanwhile, Juniper Research predicts that the global OTT market will increase to $18 billion in 2019, up from $8.5 billion just three years ago.

It’s also worth noting that the audience for OTT content is growing not only in total viewership and revenue, but geographically. Last year, Netflix tripled its global reach by expanding into an additional 130 countries as the video streaming service took its most aggressive step yet in its plans for international growth.

The reason for the surge in OTT viewership lies in immediate gratification: People want what they want when they want it. OTT allows viewers to consume content whenever and wherever they desire on their preferred device. Particularly for millennials, appointment TV is now widely considered a legacy entertainment model.

Supporting the increasing volume of streaming video requires solutions to the hosting, delivery, bandwidth and performance challenges that all too frequently frustrate the Quality-of-Service and experience of online video viewers. Whether at the source or along the last mile, insufficient bandwidth creates interruptions that result in dreaded buffering pauses. Content providers address bandwidth challenges by compressing data and bringing content closer to users by placing the data on edge servers in strategically located data centers and colocation facilities around the world. However, in order for OTT players to successfully reach their audience, it’s critical to colocate within data centers capable of providing low-latency connectivity to end users throughout their target geographic regions.



Wednesday, May 10, 2017

SMART CITIES: ENABLING A BETTER QUALITY OF LIFE

Defining the Phenomenon Sweeping the World’s Major Metropolitan Areas


Toward the end of the last decade, the number of humans inhabiting the Earth crossed the seven billion mark, with the majority of people living in metropolitan areas. According to the World Health Organization, urban residents account for 54 percent of the total global population – a number that is expected to grow nearly two percent each year until 2020. As a result, it’s become critical to establish greener and more efficient technology in major metropolises.

Once the realm of science fiction, “smart cities” are today being established around the world, transforming how we live through the use of innovative technology and analytics. A smart city is defined by the integration of information and communication technology (ICT) and IoT devices to manage a variety of public assets while acquiring critical data on an ongoing basis to improve the lives of its citizens.

According to research firm Frost & Sullivan, there are eight elements that constitute a smart city: smart governance and education, smart healthcare, smart building, smart mobility, smart infrastructure, smart technology, smart energy and smart citizen. Cities that successfully integrate at least five of these eight markers receive the distinction of being a smart city. In addition, Frost & Sullivan estimates a combined market potential of $1.5 trillion globally across these various smart city categories.
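
The five-of-eight rule is simple enough to state in code; the sketch below is a toy encoding of it, with an invented example city.

```python
# Frost & Sullivan's criterion, as quoted above: a city qualifies as
# "smart" if it integrates at least five of the eight elements.
SMART_ELEMENTS = {
    "governance_and_education", "healthcare", "building", "mobility",
    "infrastructure", "technology", "energy", "citizen",
}

def is_smart_city(integrated: set) -> bool:
    return len(integrated & SMART_ELEMENTS) >= 5

example_city = {"mobility", "energy", "infrastructure",
                "technology", "governance_and_education"}
print(is_smart_city(example_city))  # True: five of eight integrated
```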

The core focus of the world’s smart cities is to enable a better quality of life for all residents and visitors. Utilizing smart technology, officials gain enhanced visibility into the inner workings of the city on a granular level, enabling them to identify services needed by citizens. For example, in New York City, the LinkNYC program is in the process of transforming 7,500 former payphones into digital hubs, providing the public with free Wi-Fi, charging ports, and access to directions and city services. Boston has implemented solar-powered benches that not only allow users to charge their mobile devices, but also send environmental data to government officials via advanced sensors, in hopes of improving the city’s livability.



Friday, April 21, 2017

TELEHOUSE TUTELAGE: PEERING 101

Understanding Network Interconnectivity

http://www.telehouse.com/solutions/connectivity/peering/

Peering, simply defined, is the interconnection of two different networks, allowing them to exchange traffic directly with one another and enabling organizations to reach each other’s customers. Public peering is performed across a shared network and relies upon Internet Exchanges (IXs) to function and deliver content across the world. An Internet Exchange is an Ethernet switch, or set of Ethernet switches, in a colocation facility, to which all networks peering in the facility can connect. Using an IX, a network can cost-effectively peer with hundreds of other networks through a single connection.

Private peering within a colocation facility involves two networks putting routers in the same building and running a direct cable between them rather than connecting via the exchange point switch. This is common when the networks are exchanging a large volume of traffic that won’t fit on a shared connection to an exchange point.
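
The economics behind both choices can be sketched with a toy cost model. Every price below is hypothetical; real transit, IX-port and cross-connect pricing varies widely by market.

```python
# Compare effective monthly cost per Mbps for three ways of moving the
# same traffic. All prices are invented for illustration.
def cost_per_mbps(fixed_monthly_usd, traffic_mbps, per_mbps_usd=0.0):
    return fixed_monthly_usd / traffic_mbps + per_mbps_usd

traffic = 5_000  # Mbps exchanged with other networks

options = {
    "IP transit (metered)": cost_per_mbps(0, traffic, per_mbps_usd=0.50),
    "shared IX port": cost_per_mbps(1_500, traffic),
    "private cross-connect": cost_per_mbps(300, traffic),
}
for name, cost in options.items():
    print(f"{name:22s} ${cost:.3f}/Mbps/month")
# Note the trade-off: the IX port reaches hundreds of networks, while
# a private cross-connect reaches exactly one large peer. As traffic
# grows, both fixed costs amortize and undercut per-megabit transit.
```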

Most major population centers have an Internet Exchange. Because of the significant network density available in these key locations, a variety of businesses, including cloud and content providers, financial services companies, global enterprises and public sector agencies, choose to deploy their infrastructure within these facilities. This allows them to leverage the direct interconnection options available by cross-connecting with multiple network providers. Public peering arrangements still need to be negotiated with each peer, but unlike private peering, no new cabling needs to be deployed.



Wednesday, February 22, 2017

TELEHOUSE FOR TECHNOPHILES: THE BRAIN IN THE MACHINE

How Data Centers Are Using Deep Learning

Machine and deep learning emerged from artificial intelligence (AI), the theory and development of computer systems that can perform tasks normally requiring human intelligence. Deep learning mimics the activity in layers of neurons in the neocortex, the area of the brain where thinking occurs, and deep learning software can be programmed to recognize patterns in digital representations of sounds, images and other data. In fact, machine intelligence is transforming the future of everything from communications to healthcare, and from manufacturing and transportation to advanced robotics. Writers and filmmakers such as Arthur C. Clarke and Steven Spielberg have foretold a brave new world in which AI will one day influence every waking aspect of our personal and professional lives.
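
For readers curious what “layers of neurons” looks like in code, here is a minimal forward pass through a two-layer network. The weights are random, so it recognizes nothing yet; training would adjust them until the output scores pick out real patterns.

```python
# A two-layer neural network forward pass: input -> hidden layer of
# "neurons" -> output scores, one per candidate pattern.
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=64)  # e.g., a 64-value digitized signal

W1, b1 = rng.normal(size=(32, 64)), np.zeros(32)  # hidden layer
W2, b2 = rng.normal(size=(10, 32)), np.zeros(10)  # output layer

hidden = np.maximum(0.0, W1 @ x + b1)  # ReLU: each unit fires or stays silent
scores = W2 @ hidden + b2
print(f"Best-matching pattern: class {scores.argmax()}")
```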

Science fiction aside, machine learning is already well established in our everyday world, from your faithful companion Siri to facial recognition programs to language translation. But it can also help tackle some of the world’s most challenging industrial problems, such as rampant energy consumption that adversely impacts the environment. Large-scale commercial systems, including high-performance data centers, consume a lot of energy, and while much has been done to mitigate energy usage in enterprise and colocation facilities, deep learning can do much more to manage the world’s increasing need for high-performance computing power.



Wednesday, February 8, 2017

TELEHOUSE FOR TECHNOPHILES: TOURING THE DATA CENTER OF THE FUTURE

What Will Data Centers Look Like in 2017 and Beyond?

In previous Telehouse for Technophiles blogs, we’ve looked at present-day, advanced technologies affecting the data center, such as adiabatic cooling, the increased usage of Deep Machine Learning and the proliferation of Big Data analytics. But what changes can we anticipate in 2017 and beyond?

Let’s explore various predictions concerning design, operational and technological advances in the data center, as well as some of the market drivers that we can expect will influence the industry in the coming year and into the future.


Introducing the Skyscraper Data Center

Eschewing current designs in which data centers are low and sprawling, two European architects, Marco Merletti and Valeria Mercuri, have proposed a data center rising 65 stories tall. While only in the blueprint phase, the futuristic, tower-like structure would feature sustainable technology to cool hundreds of thousands of servers and be powered by geothermal energy and hydropower. The data center’s cylindrical design would create a chimney effect: hot air inside the tower rises, drawing in cold outside air, which would pass through servers arranged in pod units and cool them naturally.



Monday, January 16, 2017

Telehouse for Technophiles: Robots Enter the Data Center


Robots Promise Energy and Operational Cost Reduction

The word ‘robot’ was first heard in a 1920 Czechoslovakian theater production to describe human-like machines created to work in a factory. Thereafter, from Fritz Lang’s 1927 sci-fi masterpiece “Metropolis,” with its female robot ‘Hel,’ to Steven Spielberg’s 2001 movie “A.I.,” with its Mecha, advanced androids capable of emulating human thoughts and emotions, to Arnold Schwarzenegger’s “Terminator” franchise, robots have appeared in films, on television and in literature as agents of good and evil.

While the word robot derives from a Czech term meaning “drudgery” or “hard work,” organizations today are applying robotics to specialized data center applications in an effort to eliminate the sweat, cost and toil associated with day-to-day facility administration.

The biggest business driver for deploying robotics in the data center is the need for greater levels of efficiency. Companies are always looking for ways to make their IT infrastructures more agile and less costly, and robotics has been identified as a means to add more automation in pursuit of these goals. Additional potential benefits include IT staff head-count reduction, improved security and the increased accuracy of monitoring that automation brings.



Tuesday, January 10, 2017

Telehouse Green: Any Way the Wind Blows

Data Centers are Harnessing the Power of Wind Energy


More than eight million data centers exist around the world, drawing upwards of 30 GW of power, an amount that is steadily increasing. A study by the Natural Resources Defense Council (NRDC) revealed that if the planet’s data centers were a country, they would be the world’s 12th-largest consumer of electricity, ranking somewhere between Spain and Italy.

The carbon footprint of a mid-sized, 10 MW data center can range from three million to over 130 million kilograms of CO2, according to Green House Data. However, the good news is that this environmental impact can be significantly reduced through the adoption of renewable and sustainable energy resources, such as wind energy.
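
A range that wide falls out of simple arithmetic: annual energy use multiplied by the carbon intensity of the supplying grid. The utilization and intensity figures below are illustrative assumptions, not Green House Data’s methodology.

```python
# Annual CO2 estimate for a 10 MW facility under different grid mixes.
capacity_mw = 10
hours_per_year = 8760
utilization = 0.5  # assumed average draw as a fraction of capacity
energy_kwh = capacity_mw * 1_000 * hours_per_year * utilization

for grid, kg_co2_per_kwh in [("mostly renewable", 0.05),
                             ("average mix", 0.5),
                             ("coal-heavy", 1.0)]:
    print(f"{grid:16s} {energy_kwh * kg_co2_per_kwh / 1e6:5.1f} million kg CO2/yr")
```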

According to Data Center Knowledge, the generation of power through on-site wind turbines has gained traction across the data center community over the past few years. The latest AFCOM State of the Data Center survey showed that 34 percent of respondents have either deployed or are planning to deploy a renewable energy source for their data center, of which half are or will be using wind energy. As a testament to its effectiveness, various hyper-scale organizations including Microsoft, Google and Apple are now relying upon wind energy as a source of power at some of their facilities.

View original source: http://www.telehouse.com/2017/01/any-way-the-wind-blows/