
Wednesday, August 29, 2018

Fault Tolerance and Fault Avoidance: Looking Beyond Data Center Tiers


As the general argument goes, the fault tolerance of a tier 4 data center may be overkill for all but the most mission-critical applications of the largest enterprises. When it comes time for a business to decide, perhaps the perspective should shift to the equally important need for fault avoidance.

According to the accepted Uptime Institute standard, tier 4 data center specifications call for two parallel power and cooling systems with no single point of failure (also known as 2N). While this level of fault tolerance often comes at a premium price, many enterprises see the security, reliability and redundancy as worth it for the reduction in potential downtime compared with a tier 3 data center.

This elimination of single points of failure across all components is certainly nothing to scoff at when it comes to the performance of the computing equipment. Knowing that any compute component can be removed at any time, in a planned way, without disrupting the compute systems is a major plus. But even with the understanding that comes from reading a comprehensive data center tier level guide, it becomes apparent that thinking should go beyond the tier levels to a colocation data center’s ability to provide fault avoidance.

Fault avoidance rests on the fact that many of the complications that lead to data center downtime can be prevented with equipment and systems monitoring, a proactive, well-trained staff with thorough procedures, and strict maintenance protocols. In other words, fault tolerance, while important, is reactive, whereas fault avoidance focuses on prevention, which is equally important.

Whether it is a tier 4 data center or a tier 3 data center, enterprises should be looking closely at these other fault avoidance parameters and systems. For instance, does the facility utilize a sophisticated and proven building management system (BMS) and building automation system (BAS)? These crucial systems allow operators to monitor the health of data center equipment through gathered sensor data for real-time insights. The collected data can then be used to deliver an automated response or direct proactive technician intervention.
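To make that concrete, here is a minimal Python sketch of the kind of threshold logic a BMS/BAS workflow might apply to gathered sensor data; the sensor names, thresholds, and responses are illustrative assumptions rather than any particular vendor’s implementation.

```python
# Minimal sketch of BMS-style monitoring logic: compare sensor readings
# against thresholds and decide between an automated action and human escalation.
# Sensor names, thresholds, and actions are illustrative assumptions only.

THRESHOLDS = {
    "supply_air_temp_c": (18.0, 27.0),   # acceptable range (illustrative envelope)
    "humidity_pct":      (20.0, 80.0),
    "ups_load_pct":      (0.0, 80.0),
}

def evaluate(readings: dict) -> list:
    """Return a list of actions for out-of-range readings."""
    actions = []
    for sensor, value in readings.items():
        low, high = THRESHOLDS.get(sensor, (float("-inf"), float("inf")))
        if value < low or value > high:
            if sensor == "supply_air_temp_c":
                actions.append("automated: raise CRAC fan speed and open a ticket")
            else:
                actions.append(f"proactive: dispatch technician for {sensor}={value}")
    return actions

if __name__ == "__main__":
    print(evaluate({"supply_air_temp_c": 29.5, "humidity_pct": 45.0, "ups_load_pct": 62.0}))
```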

Since we have yet to reach the ideal of the truly automated data center, highly skilled operations teams must work in tandem with the systems to anticipate problems before they occur and quickly troubleshoot issues when they do arise. Visit source for more details.

Contact Details:
Telehouse America
7 Teleport Drive,
Staten Island,
New York, USA 10311
Phone No: 718–355–2500
Email: gregory.grant@telehouse.com

The Essential Role of Colocation Data Centers for IoT and Big Data

For startups and enterprises alike, data center colocation has become a major part of doing business in the digital age, where IoT is ubiquitous across every sector and big data is now just data. The main reason is that, for most businesses, the IoT framework goes far beyond the reach of the local data center, with an ever-expanding network edge of sensors that stretch across a city and even the world.

Big Data’s impact on the data center is far-reaching, since achieving low-cost, low-latency application performance is imperative for IoT-driven businesses. This is especially true as more and more IoT data processing is pushed out to the edge to get as close as possible to the source sensors and the end-users of the resulting data analytics. Consequently, today’s data center colocation providers can offer the best means for filling the gap in IoT’s edge computing landscape while offering a cost-effective means for managing, storing, and organizing big data.



While the cloud is also a major part of that IoT/big data world, businesses require instantaneous access, fast data transport, and reliable compute resources. The technology and cost burdens associated with moving massive amounts of data into the cloud make it a poor strategy when latency and accessibility are driving IoT and big data for a business.

Effective IoT, and the resultant big data delivered from sensors, requires the shortest possible distance between sensors, data analytics applications, and the end-users of the processed data. Data center colocation providers can effectively serve IoT framework needs by delivering an abundance of options, including direct connections to major cloud providers and broad peering choices.

Colocation becomes the most efficient and flexible means to manage and analyze the enormous amounts of IoT sensor data for factories, supply chains, power grids, distributed products and even cities. Click here to visit original source.

Contact Details:
Telehouse America
7 Teleport Drive,
Staten Island,
New York, USA 10311
Phone No: 718–355–2500
Email: gregory.grant@telehouse.com

Monday, July 23, 2018

The Benefits of Data Center Network Flexibility with NaaS


Enterprises are well aware of the benefits of a Tier 3 data center as part of a secure and agile hybrid and multicloud strategy that ensures uptime, flexibility, and cost containment. But today, as more enterprises are seeing the need for diverse cloud connectivity to meet application access demands, network as a service (NaaS) is playing a part in expanding that scope.

There are a number of benefits that come from NaaS, but chief among them is its ability to provide enterprises with on-demand provisioning and management of the network. This drives efficient expansion, management and cost containment by providing variable network connectivity that adapts to network load requirements. This level of network flexibility as part of a cloud strategy makes it easier for businesses to add and reconfigure resources quickly and meet fluctuating network transport needs based on real-time utilization.
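As a rough illustration of what on-demand provisioning can look like from the enterprise side, the Python sketch below calls a hypothetical NaaS provider’s REST API to change a circuit’s committed bandwidth; the endpoint, payload fields, and authentication are invented for illustration and do not describe any specific provider’s interface.

```python
# Illustrative only: adjust the bandwidth of a provisioned network service
# through a hypothetical NaaS provider's REST API. Endpoint, payload fields,
# and authentication scheme are assumptions, not a real provider's interface.
import requests

NAAS_API = "https://naas.example.com/v1"   # hypothetical endpoint
API_TOKEN = "..."                          # credential supplied by the provider

def scale_circuit(circuit_id: str, bandwidth_mbps: int) -> dict:
    """Request a new committed bandwidth for an existing circuit."""
    resp = requests.patch(
        f"{NAAS_API}/circuits/{circuit_id}",
        json={"bandwidth_mbps": bandwidth_mbps},
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

# Example: burst to 1 Gbps ahead of a nightly replication window.
# scale_circuit("circuit-nyc-01", 1000)
```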

Data centers like Telehouse New York that partner with NaaS providers can deliver connectivity options into an SD-WAN framework that is managed by the service provider. By enabling network management and provisioning via a web interface, enterprises can lower the growing costs of managing and configuring hardware through the service provider’s SD-WAN software.

These offerings add a great deal of value for enterprises that require Tier 3 data center services.

The variable network connectivity that NaaS brings to both cloud access and cloud backbone networks is just as important as the power redundancy and added security benefits of Tier 3 data center specifications. According to the 2018 TechTarget IT Priorities survey, in which 42% of respondents reported using cloud-based SaaS offerings, streamlined network management and monitoring have become a priority.

Partnering with a Tier 3 data center that delivers true connectivity flexibility via a cloud access network, with workload bursting and balancing via NaaS, helps keep costs in hand while enabling organizations to tailor networks and workloads for peak efficiency and performance.

As a result, in-house data centers can be seamlessly connected to colocation or managed services facilities and to on-demand cloud data centers for a multi-site, hybrid data center model. Click here to visit original source.

Contact Details:
Telehouse America
7 Teleport Drive,
Staten Island,
New York, USA 10311
Phone No: 718–355–2500
Email: gregory.grant@telehouse.com

Tuesday, March 27, 2018

Solutions for Disaster Recovery that Protect Smart Cities

According to Gartner, a technology research and advisory firm, smart cities such as New York, Tokyo, and London use roughly 2.3 billion connected things. That number represents a 42 percent increase over 2016. Soon, smart cities will be the catalyst behind an economic boom and improved quality of life for the people living in them.

As the backbone of smart cities, data centers and colocation sites must have the right disaster recovery solutions in place. This ensures not only flawless connectivity and top data security but also public health and safety.

To streamline city services, smart cities rely on rich data in real time. Software, hardware, and geospatial analytics can improve livability and municipal services. With enhanced sensors, the Internet of Things (IoT) can reduce the amount of energy consumed by street lights and preserve resources by regulating water flow.

Due to the location of many smart cities, as well as other potential risks, disaster recovery cloud services are vital. Disaster recovery providers protect power and communication systems from disruptions caused by power outages, floods, and even cyber attacks. To continue reading, visit the source by clicking here.

Thursday, February 15, 2018

Telehouse Introduces Data Center Robotics



The term “robot” comes from the Czech word robota, meaning “hard work” or “drudgery.” With advances in technology, data center colocation services now include robotics as part of specialized applications that reduce human labor. Primarily, data center colocation providers deploy robotics to enhance efficiency. Facing fierce competition, businesses continually search for ways to make their infrastructures less expensive and more agile. Robotics reduce the need for IT staff while ensuring greater monitoring accuracy and improved security.

Both EMC and IBM currently rely on iRobot Create, which traverses data center colocation facilities to check for fluctuations in temperature, humidity, and system vibrations. After the robot scours a data center colocation site for the source of vulnerabilities, like cooling leaks, it gathers data for processing through a Wi-Fi connection. An algorithm converts the data into a thermal map so that managers can identify anomalies.
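A simplified Python sketch of that last step, assuming the robot’s readings arrive as (x, y, temperature) samples; the grid resolution and anomaly threshold are illustrative and are not the algorithm EMC or IBM actually use.

```python
# Sketch: turn scattered (x, y, temperature) samples from a roving robot into
# a coarse thermal map and flag grid cells that deviate sharply from the mean.
# Grid resolution and the 1.5-sigma threshold are illustrative assumptions.
from statistics import mean, pstdev
from collections import defaultdict

def thermal_map(samples, cell_size=2.0):
    """samples: iterable of (x_m, y_m, temp_c). Returns {(col, row): avg_temp}."""
    cells = defaultdict(list)
    for x, y, t in samples:
        cells[(int(x // cell_size), int(y // cell_size))].append(t)
    return {cell: mean(temps) for cell, temps in cells.items()}

def anomalies(tmap, sigmas=1.5):
    """Flag cells whose average temperature sits far from the facility mean."""
    temps = list(tmap.values())
    mu, sd = mean(temps), pstdev(temps)
    return {cell: t for cell, t in tmap.items() if sd and abs(t - mu) > sigmas * sd}

readings = [(0.5, 0.5, 22.1), (2.4, 0.8, 22.6), (4.2, 1.1, 31.7), (6.0, 0.9, 22.3)]
print(anomalies(thermal_map(readings)))   # the hot cell hints at a cooling leak
```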

Still in the concept phase, PayPerHost is working on Robonodes, which would replace a failed customer server or storage node. Sony and Facebook rely on robotic units as part of Blu-ray disc-based media storage archives. Overall, robotics help businesses reduce the footprint of data center managed services while simplifying infrastructure.

Telehouse is responding to the increased demand for cloud computing and technological advances. Someday, data center resilience and archiving efficiency will improve due to more robust systems, automation software, and intense planning.

Thursday, January 18, 2018

Algorithms: Smart Yet Slightly Frightening



In smart cities, as well as data center and colocation facilities, algorithms play a critical role. Algorithms are the reason computer operating systems exist and, by extension, the World Wide Web and Google. For a colocation provider, algorithms make it possible to provide customers with a safe and reliable service.

Algorithms also help transform Big Data, first converting it into analytics and then into action. Colocation service providers are at the heart of smart cities, with algorithms assisting Data Center Infrastructure Management (DCIM) tools in predicting cooling problems.

Load balancing algorithms are critical for colocation services, distributing application or network traffic across servers, thereby making them more efficient. There are also smart storage algorithms that process rich media requests, including videos, and cut energy consumption for enterprise-level storage area networks by as much as 50 percent.
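As one concrete example, a least-connections policy, among the simplest load-balancing algorithms, can be sketched in a few lines of Python; the server names and connection counts below are illustrative.

```python
# Minimal least-connections load balancer sketch: send each new request to the
# backend currently handling the fewest active connections. Illustrative only.
active = {"server-a": 12, "server-b": 7, "server-c": 9}   # current connection counts

def pick_backend(counts: dict) -> str:
    """Return the backend with the fewest active connections."""
    return min(counts, key=counts.get)

target = pick_backend(active)   # "server-b"
active[target] += 1             # record the newly assigned connection
```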

In unimaginable ways, algorithms impact both personal and professional lives, which is exciting, yet somewhat unnerving. As an increasing number of businesses adopt and enhance digital solutions, there is a strong chance of seeing more colocation service providers relying on algorithms for storage, computing, and networking.

For organizations with business-critical data, Telehouse provides superior colocation services with 48 data centers worldwide. Ultimately, business owners have peace of mind thanks to high security, redundant power, and flawless interconnection to hundreds of service providers.

Contact Details:
Telehouse America
7 Teleport Drive,
Staten Island,
New York, USA 10311
Phone No: 718–355–2500
Email: gregory.grant@telehouse.com

Friday, September 1, 2017

ALGORITHMS: SCARY SMART AND SOMETIMES JUST PLAIN SCARY

Algorithms, complex mathematical equations designed to solve problems or perform a task, automate much of the technology that makes smart cities smart, from intelligent public transportation and traffic management to smart electrical grids and water usage. Algorithms are also a fundamental tool in transforming Big Data, first into useful analytics, and eventually into action. More on that later.


Data centers and colocation facilities, the pillars of smart cities, are replete with examples of the use of algorithms. Data Center Infrastructure Management (DCIM) tools predict cooling issues based on algorithms built from temperature pattern models. There are load balancing algorithms, which play an important role in distributing network or application traffic across servers, thereby dynamically improving the efficiency of computing resources. And there are smart storage algorithms, which process requests for video and other rich media, and hold the promise of reducing energy use for enterprise-level storage area networks by 20 to 50 percent.
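As a toy illustration of the DCIM idea rather than any vendor’s actual model, the short Python sketch below fits a linear trend to recent inlet-temperature readings and estimates how long until the trend crosses an alarm threshold; the readings, threshold, and linear-trend assumption are all illustrative.

```python
# Toy DCIM-style prediction: fit a linear trend to recent inlet-temperature
# readings and estimate when the trend crosses an alarm threshold.
def minutes_until_breach(temps_c, interval_min=5, limit_c=27.0):
    """temps_c: readings taken every interval_min minutes, oldest first."""
    n = len(temps_c)
    if n < 2:
        return None
    xs = range(n)
    x_bar, y_bar = (n - 1) / 2, sum(temps_c) / n
    slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, temps_c)) / \
            sum((x - x_bar) ** 2 for x in xs)
    if slope <= 0:
        return None                      # cooling is stable or improving
    return max(0.0, (limit_c - temps_c[-1]) / slope * interval_min)

print(minutes_until_breach([22.0, 22.4, 22.9, 23.5, 24.2]))   # ~25 minutes to reach 27 C
```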

The world’s first named algorithm, the eponymous Euclidean algorithm, came to us around 300 B.C. and is still used by computers today. In fact, without algorithms, there would be no computer operating systems, no World Wide Web, and no Google with which to Google “algorithms,” much less the name of that actress who starred in that movie with that guy.

Okay, so now we have your attention.

Getting Too Personal

Today, algorithms are increasingly affecting our personal and professional lives in ways that we can’t imagine or might even find unsettling. Consider the algorithm created by the analytics team at the U.S. retailer Target, which could calculate whether a woman is pregnant and even when she is due to give birth.

In a nutshell, Target, like every retailer, stores a history of every item their customers have bought and any demographic information the company has collected from them. Target analyzed this information against historical buying data for all the women who had ever signed up for its baby registries. The analytics team then created an algorithm that identified 25 products — from unscented lotion to supplements such as calcium and zinc to oversized purses large enough to double as a diaper bag — which, when collectively analyzed, assigned each shopper a “pregnancy prediction” score. More importantly, for direct marketing purposes, its algorithm also estimated a woman’s due date, so that Target could send coupons to customers’ homes timed to specific stages of pregnancy.
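A deliberately simplified sketch of how such a weighted basket score might work in principle; the products, weights, and threshold below are invented for illustration and are not Target’s actual model.

```python
# Illustrative weighted-basket scoring, loosely in the spirit of the idea
# described above. Product names, weights, and the 0.5 threshold are invented.
WEIGHTS = {
    "unscented lotion": 0.25,
    "calcium supplement": 0.20,
    "zinc supplement": 0.15,
    "oversized purse": 0.10,
    "cotton balls (large pack)": 0.10,
}

def pregnancy_score(basket: set) -> float:
    """Sum the weights of flagged products found in a shopper's purchase history."""
    return sum(w for item, w in WEIGHTS.items() if item in basket)

basket = {"unscented lotion", "calcium supplement", "oversized purse"}
if pregnancy_score(basket) >= 0.5:          # illustrative decision threshold
    print("add shopper to the targeted mailing list")
```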

And what could be the harm in that? Pregnancy, birth, an impending bundle of joy? Well, some women, families, and especially teenagers preferred that their pregnancies remain private. But Target’s predictive, algorithm-based marketing hadn’t factored that very human element into its campaign, and trouble ensued.

Every Digital Breath You Take

And then of course there is the U.S. National Security Agency’s XKeyscore program, which was one of the covert projects revealed by Edward Snowden. You may never have heard of XKeyscore, but it definitely has heard of you.

XKeyscore collects every digital breath you’ve ever taken on the Internet, including browsing history, Google searches, and the content of your emails and online chats, and, at a tap of the keyboard, can process that data through an algorithm to identify potentially subversive activity. The NSA’s own training materials identified XKeyscore as its “widest reaching” system for developing intelligence from the Internet. Click here to visit original source.

Contact Details:
Telehouse America
7 Teleport Drive,
Staten Island,
New York, USA 10311
Phone No: 718–355–2500
Email: gregory.grant@telehouse.com

Friday, July 21, 2017

GLOBAL SPOTLIGHT: INTERNET AND DATA CENTER GROWTH IN RUSSIA

Multinationals Seeking a Commercial Presence in Russia Push the Market Forward



With nearly 74 million users, Russia is Europe’s largest internet market. Given that figure, the metrics surrounding Russia’s data center industry tell a mixed story. Consider that the country’s commercial data center market reached a modest $303 million in 2014, but has been growing at approximately 25 percent per year over the last five years, according to Direct INFO, a research consultancy.

In fact, as recently as eight years ago there were only half a dozen Tier I to Tier II commercial data centers in the entire country and these were largely operated by systems integrators. At the time, Russia’s technology talent pool lacked the necessary skillsets to build and operate modern data centers.

Today, however, Russia has no fewer than 180 data centers, most of which are in Moscow. Sixteen of the 20 largest data centers in the country operate in the capital, each containing more than 1,000 racks, with an average total capacity of 12 MW. Over the next several years, that number is anticipated to grow due to a confluence of factors, and not just in Moscow.

Government Regulations and Global Business Drive Growth

The data center colocation market, in particular, is being stimulated by government legislation, passed in September 2015, which forbids the storage of Russian citizens’ personal data on servers located abroad. Multinational and Russian financial institutions, as well as insurance and investment companies, are also facing new, more stringent regulations on international activity, which will increase the demand for premium data center services.

The other main drivers of the Russian colocation sector include a steady rise in demand for new white space, a growing interest among Russian enterprises in outsourced data center strategies, and an increasing number of international service providers and enterprises looking to establish a commercial presence in Russia.

With the development of enterprise branch networks, it also becomes desirable for companies to centralize the processing and storage of data for complex business applications, for example ERP and CRM systems. Hence, commercial data centers will increasingly be used to centralize the IT infrastructures of global companies. Moreover, the use of commercial data centers will allow multinational firms to ensure business continuity thanks to those facilities’ high reliability.

On the Edge and in the Cloud

The owners of large-scale web projects, including search engines, web portals and social networks that generate a significant amount of traffic and number of users, also seek to locate their equipment closer to the end-user, or on the edge of the network, to reduce the costs of data transfer. These web-scale players are specifically interested in regional data centers. Visit original source....

Contact Details:
Telehouse America
7 Teleport Drive,
Staten Island,
New York, USA 10311
Phone No: 718–355–2500
Email: gregory.grant@telehouse.com

Monday, May 22, 2017

TELEHOUSE TUTELAGE: EDGE COMPUTING IN THE ERA OF IOE

Micro Data Centers and Colocation Providers Will Enable a Future of Ubiquitous Connectivity

Driverless cars, drone-powered product delivery, and remotely monitored, environmentally controlled doghouses are but a few examples of the wondrous Internet of Everything. For the uninitiated, the Internet of Everything, or IoE, builds on the foundation of the Internet of Things (IoT) by adding network intelligence that allows convergence, orchestration and visibility across previously disparate systems. As we will learn further on, both micro data centers and colocation providers will play an integral role in enabling a future of ubiquitous connectivity.

One can envision the IoT as the equivalent of a railroad, including the tracks and connections, whereas the IoE is the railway line, as well as the connected trains, rail weather monitoring systems and sensors, departures and arrivals board, and even staff and customers. The Internet of Everything connects all these separate “things” into one cohesive whole, enabling these IoT-enabled devices and connected humans to communicate and share data with each other in real time.

Metaphors aside, the enormous amount of on-demand connectivity, compute, networking and storage necessary to enable the IoE will be a challenge to deliver. Research firm Gartner forecasts that 8.4 billion connected things will be in use worldwide by the end of the year, up 31 percent from 2016, reaching 20.4 billion by 2020.

Considered a direct outcome of the growing interest in IoT and IoE, edge computing brings bandwidth-intensive content and latency-sensitive applications closer to the user or data source. With edge computing, IT professionals can provide data processing power at the edge of a network instead of maintaining it in the Cloud. Because of the need for high-speed computing (which, for example, would be critical for a driverless car to be able to stop for traffic signs and avoid fender benders), edge computing is considered more reliable than Cloud computing.

While much information will still be uploaded and processed through the Cloud, some applications will demand ultra-fast access to data, requiring the use of physical infrastructure that is closer to the edge rather than where the data is centrally stored. However, as information is exchanged between more local and centralized data center facilities, one must consider the challenges that will emerge as a consequence. These include possible service disruption as well as latency and network reliability issues.

The reality is that to enable the IoE, many organizations will deliver specific IoT applications and services from a variety of data centers, ranging from smaller in-house networking facilities to large colocation data centers. And this will have implications for the overall levels of resilience and security that will be expected.

Large data centers and colocation facilities have the highest standards for such functions as data backup, failover systems and physical security. Backups are performed regularly and there is ample storage and server redundancy, enhanced by virtualization, in the event of equipment failure. Highly redundant power and cooling systems are de rigueur, and physical security is strictly enforced to ensure no unauthorized access. Click here to visit original source...

Contact Details:
Telehouse America
7 Teleport Drive,
Staten Island,
New York, USA 10311
Phone No: 718–355–2500
Email: gregory.grant@telehouse.com

Friday, April 21, 2017

TELEHOUSE TUTELAGE: PEERING 101

Understanding Network Interconnectivity

http://www.telehouse.com/solutions/connectivity/peering/

Peering, simply defined, is the interconnection of two different networks, allowing them to directly exchange traffic between one another, and organizations to reach each other’s customers. Public peering is performed across a shared network and relies upon Internet Exchanges (IXs) to function and deliver content across the world. An Internet Exchange is an Ethernet switch or set of Ethernet switches in a colocation facility, to which all networks peering in the facility can connect. Using an IX, a network can cost-effectively peer with hundreds of other networks through a single connection.

Private peering within a colocation facility involves two networks putting routers in the same building and running a direct cable between them rather than connecting via the exchange point switch. This is common when the networks are exchanging a large volume of traffic that won’t fit on a shared connection to an exchange point.

Most major population centers have an Internet Exchange. Because of the significant network density available in these key locations, a variety of businesses, including cloud and content providers, financial services companies, global enterprises and public sector agencies choose to deploy their infrastructure within these facilities. This allows them to leverage the direct interconnection options available by cross-connecting with multiple network providers. Peering arrangements need to be negotiated with each peer, but no new cabling needs to be deployed, unlike private peering. Visit Original Source....


Contact Details:
Telehouse America
7 Teleport Drive,
Staten Island,
New York, USA 10311
Phone No: 718–355–2500
Email: gregory.grant@telehouse.com

Tuesday, March 28, 2017

TELEHOUSE, THE HUMAN ELEMENT: A VIEW FROM THE BRIDGE

Interview with Akihiko Yamaguchi, EVP and CMO of KDDI America and COO of Telehouse America


Aki Yamaguchi is the Executive Vice President and Chief Marketing Officer of KDDI America and Chief Operating Officer of Telehouse America. We recently had the opportunity to interview Mr. Yamaguchi and discuss his background, the impact of Big Data and the Internet of Things (IoT) on data centers, as well as the current state of the colocation market.

From E-mail to IoT

Interestingly enough, Mr. Yamaguchi, who’s worked with the KDDI Group for over 26 years, did not originally set out to start his professional career in the technology industry.

“To be perfectly honest, at the beginning of my career I was not at all interested in any of the technical disciplines,” he shared with us. “I studied English literature and was attracted to business as an opportunity to advance my language skills and interact with other professionals from all over the world.”

After joining KDDI, Mr. Yamaguchi quickly developed an affinity for the telecommunications sector and was struck by the nature of its continuously developing innovations.

“The telecom industry has been growing very quickly and things shift rapidly,” he noted. “Technologies get old after six months or so, and I was very attracted to the dynamic changes one sees happening throughout the industry every day.”

Looking back, Mr. Yamaguchi can still recall the initial impact of email as a means for business development and customer relations.

“It’s a funny thing,” he stated, recalling the early days of widespread internet access and email. “When business shifted from simple handwriting to personal computers and e-mail correspondence, I would often call clients immediately after sending an email for fear that it wouldn’t be received. It truly was a dramatic change for many professionals.” Read more at the original source...

Contact Details:
Telehouse America
7 Teleport Drive,
Staten Island,
New York, USA 10311
Phone No: 718–355–2500
Email: gregory.grant@telehouse.com

Friday, March 24, 2017

TELEHOUSE GLOBAL SPOTLIGHT: TOKYO, THE HEART OF JAPAN’S DIGITAL AWAKENING

Driven by Global Enterprises, Tokyo Has Become Asia’s Largest Colocation Market

Telehouse Tokyo Data Centers

The telecommunications market of Japan is among the largest, most profitable and most advanced in the world. While Japan was initially slow to introduce the internet and broadband access, today the country has more broadband subscribers than all of Europe combined. In fact, driven by the demand for high-speed internet and mobility services, the Japanese telecom industry is on track to become one of the most developed global markets.

The growth of Japan’s telecom industry can be attributed to the burgeoning middle class and the increased interest of leading global enterprises in establishing a presence there. Recognizing the importance of enhancing the Information and Communications Technology (ICT) sector across the country to improve social and commercial development, the Japanese government has taken active steps to develop its nascent digital economy, including a more liberalized approach to foreign investment and programs to encourage technological innovation.

At the heart of Japan’s digital awakening, there is a growing demand for data center space in the country’s major metro areas, especially Tokyo, spurred in part by the need to accommodate the expansion of leading multinationals’ business across the island nation. The greater Tokyo metropolitan area, the most populous in the world, has a population of approximately 35 million. View Original Source...


Contact Details:
Telehouse America
7 Teleport Drive,
Staten Island,
New York, USA 10311
Phone No: 718–355–2500
Email: gregory.grant@telehouse.com

Tuesday, March 21, 2017

TELEHOUSE GREEN: INNOVATION THROUGH COLLABORATION

How the Open Compute Project Is Transforming Data Center Infrastructure and Hardware


As Albert Einstein once stated, “Problems cannot be solved by the same level of thinking that created them.”

In the data center and colocation industry, where copious amounts of energy used to power critical infrastructure cause significant strain on natural resources and the bottom line of facility owners and operators, the need for a new level of thinking has become an existential requirement. To meet this challenge, the data center community has been forced to shift its longstanding and entrenched perspective on hardware and infrastructure to become more dynamic, inventive and holistic in its approach to design.

Enter the Open Compute Project, which was inspired by the creativity and collaboration exemplified by open source software. The Open Compute Project officially launched in 2011 when Facebook decided to share its design for the world’s most energy-efficient data center with the public. Soon after, Intel®, Rackspace, Goldman Sachs and Andy Bechtolsheim, the electrical engineer who co-founded Sun Microsystems and later became an early investor in Google, lent their support.

The mission of the Open Compute Project is based on a simple, yet powerful concept. Members of this community believe that openly sharing ideas, specifications and intellectual property is the key to maximizing innovation and reducing complexity in the tech components needed to support the growing demands on compute infrastructure. Today, with hundreds of participants actively collaborating, the Open Compute Project is transforming data center infrastructure and hardware design with a focus on energy and material efficiencies.

The Data Center: A Single, Ubiquitous Ecosystem

While traditional data center design often treats components such as the building, servers and software in isolation, the Open Compute Project evaluates the collective influence of all components within the data center environment. This unique approach of viewing the data center as a single, ubiquitous ecosystem leads to optimized energy and material use, as well as reduced environmental impact. Three core aspects of the Open Compute Project’s approach to data center infrastructure and hardware include enhanced rack design, localized back-up power and evolved machinery. View Original Source...


Contact Details:
Telehouse America
7 Teleport Drive,
Staten Island,
New York, USA 10311
Phone No: 718–355–2500
Email: gregory.grant@telehouse.com