
Friday, June 30, 2017

Telehouse Green: How Green Is My Cloud?


UNDERSTANDING THE ENERGY EFFICIENCY OF CLOUD-BASED COMPUTING
Forrester estimates that worldwide spending on Public Cloud computing services will reach $160 billion in 2020, reflecting a 22 percent annual growth rate over the preceding five years. And it's not just the Public Cloud that is experiencing a spike; Private and Hybrid Cloud usage is growing too.
Among enterprises with 1,000 or more employees, Private Cloud adoption increased from 63 percent to 77 percent, and Hybrid Cloud adoption rose from 58 percent to 71 percent, between 2015 and 2016, according to RightScale's 2016 State of the Cloud survey. Enterprises that use the Cloud are, on average, leveraging three Public Clouds and three Private Clouds apiece.
Businesses are increasingly opting to switch from internal resources to cloud-based computing to enjoy benefits such as faster scalability of capacity, pay-as-you-go pricing, and access to cloud-based applications and services without the need to purchase and manage expensive on-premises infrastructure.
But whether you're considering a Public, Private or Hybrid Cloud configuration, as-a-service computing offers another distinct advantage over on-premises alternatives: it's comparatively greener. A study by Accenture found that for large enterprises, Cloud adoption can cut energy use and carbon emissions by 30 to 60 percent compared with on-premises IT infrastructure. For mid-sized firms using the Cloud, carbon emissions and energy consumption can be reduced by as much as 60 to 90 percent.
Let’s examine why.
Green That Is Virtually Self-Evident
Some of the reasons why cloud-based infrastructure is greener than on-premises equipment are…well…virtually self-evident.
Virtualization, the definitive technology at play, enables a single physical server to run multiple operating system images simultaneously. Through consolidation, server virtualization reduces the total physical server footprint. Fewer servers mean less power consumed and a smaller carbon footprint. When fewer machines are required to run the same workloads, less data center space is needed, and with less physical equipment plugged in, a facility consumes less electricity. A rough sense of the savings is sketched below.
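To make the consolidation arithmetic concrete, here is a minimal back-of-envelope sketch in Python. The server counts, power draws and emissions factor are illustrative assumptions, not figures from the studies cited above.

```python
# Back-of-envelope estimate of the energy saved by consolidating many
# lightly loaded physical servers onto a few virtualization hosts.
# Every input below is an illustrative assumption, not a measured value.

HOURS_PER_YEAR = 8760
CO2_KG_PER_KWH = 0.4  # assumed grid emissions factor

def annual_energy_kwh(servers: int, avg_draw_watts: float) -> float:
    """Annual electricity use for a fleet of servers running 24/7."""
    return servers * avg_draw_watts * HOURS_PER_YEAR / 1000

# Before: 100 dedicated physical servers drawing ~300 W each.
before = annual_energy_kwh(servers=100, avg_draw_watts=300)

# After: the same workloads run as VMs on 10 hosts that draw more
# power each (~500 W) but replace ten machines apiece.
after = annual_energy_kwh(servers=10, avg_draw_watts=500)

saved = before - after
print(f"before: {before:,.0f} kWh/yr, after: {after:,.0f} kWh/yr")
print(f"saved:  {saved:,.0f} kWh/yr (~{saved * CO2_KG_PER_KWH / 1000:,.1f} t CO2)")
```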
Interestingly, virtualization is nothing new. IBM pioneered the concept in the 1960s, but its potential has only been fully realized with the advent of modern data center and server technologies.

Telehouse for Technophiles: Can Your Data Center Survive the Next Big Earthquake?

A Look at Disaster Preparedness in Los Angeles-Based Data Centers
The ground shook violently, car alarms shrieked and boutique windows shattered across the busy sidewalks of Hollywood Boulevard. On January 17, 1994, a 6.7-magnitude earthquake struck Northridge, roughly 20 miles northwest of downtown Los Angeles, producing some of the strongest ground motion ever instrumentally recorded in a North American city. It was the costliest natural disaster to strike the United States up to that time, causing billions of dollars in structural damage and economic loss and severely damaging hundreds of buildings throughout the Los Angeles metro area, including skyscrapers, hospitals, stadiums and apartment complexes.
Straddling the San Andreas Fault, California experiences an average of 10,000 earthquakes every year, according to the United States Geological Survey. While most are mild enough to go unnoticed by the general public, roughly 15 to 20 of them each year reach a magnitude greater than 4.0, exposing vulnerable structures to significant damage.
In California, earthquakes aren’t a seasonal threat like hurricanes, but can strike at any time without warning. Experts predict there is a 67 percent chance of an earthquake with a magnitude of 6.7 or greater striking Los Angeles within the next 30 years.
Disaster Recovery Planning is the Key to Business Continuity
An earthquake may not irretrievably destroy a company's information, but without access to critical data such as customer and financial records, its business operations are unlikely to withstand the event. A high-magnitude earthquake can easily disable an enterprise data center or colocation site by damaging the building's structure, the equipment inside, the facility's access to power, or the many connections established within it.
Data center operators, particularly those in California, must have an adequate disaster recovery plan to mitigate the threat of downtime during an earthquake. Disaster preparedness in seismically active regions requires a combination of virtual and physical safeguards to ensure the facility's continued operations. Secondary, offsite backups are a common way for data centers to prepare for disaster. By replicating data in the cloud, data center operators eliminate a single point of failure, ensuring that mission-critical information remains fully accessible.
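As a minimal illustration of the offsite-replication idea, the sketch below copies a local backup to cloud object storage in a distant region using the AWS boto3 library. The bucket name, region and file path are hypothetical placeholders, and a production setup would add encryption, retention policies and integrity verification.

```python
# Minimal sketch: replicate a local backup to object storage in a
# geographically distant region, so an earthquake at the primary site
# cannot destroy the only copy. Bucket, region and paths are
# hypothetical placeholders.
import datetime
import os

import boto3  # assumes AWS credentials are configured in the environment

def replicate_backup(local_path: str, bucket: str, region: str) -> str:
    """Upload a backup file under a date-stamped key; return the key."""
    s3 = boto3.client("s3", region_name=region)
    key = f"dr-backups/{datetime.date.today():%Y-%m-%d}/{os.path.basename(local_path)}"
    s3.upload_file(local_path, bucket, key)
    return key

# Replicate last night's database dump far away from the fault line.
replicate_backup("/var/backups/db-dump.sql.gz",
                 bucket="example-dr-backups", region="us-east-2")
```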

Monday, May 22, 2017

TELEHOUSE TUTELAGE: EDGE COMPUTING IN THE ERA OF IOE

Micro Data Centers and Colocation Providers Will Enable a Future of Ubiquitous Connectivity

Driverless cars, drone-powered product delivery, and remotely monitored, environmentally controlled doghouses are but a few examples of the wondrous Internet of Everything. For the uninitiated, the Internet of Everything, or IoE, builds on the foundation of the Internet of Things (IoT) by adding network intelligence that allows convergence, orchestration and visibility across previously disparate systems. As we will learn further on, both micro data centers and colocation providers will play an integral role in enabling a future of ubiquitous connectivity.

One can envision the IoT as the equivalent of a railroad, including the tracks and connections, whereas the IoE is the entire railway system: the connected trains, the rail weather monitoring systems and sensors, the departures and arrivals boards, and even the staff and customers. The Internet of Everything connects all these separate “things” into one cohesive whole, enabling IoT-enabled devices and connected humans to communicate and share data with each other in real time.
Metaphors aside, delivering the enormous amount of on-demand connectivity, compute, networking and storage necessary to enable the IoE will be challenging. Research firm Gartner forecasts that 8.4 billion connected things will be in use worldwide by the end of the year, up 31 percent from 2016, and that the total will reach 20.4 billion by 2020.
Considered a direct outcome of the growing interest in IoT and IoE, edge computing brings bandwidth-intensive content and latency-sensitive applications closer to the user or data source. With edge computing, IT professionals can provide data processing power at the edge of a network instead of maintaining it all in a centralized Cloud. For applications that cannot tolerate a network round trip, such as a driverless car that must stop for traffic signs and avoid fender benders, processing at the edge is considered faster and more dependable than relying on a distant Cloud. The division of labor is sketched below.
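The split between local decisions and Cloud-bound aggregates can be sketched in a few lines of Python. Everything here, from the sensor values to the braking threshold, is a hypothetical toy example rather than real automotive code.

```python
# Toy model of edge computing: the latency-critical decision happens on
# the local node, and only a compact summary travels to the Cloud.
from statistics import mean

BRAKE_THRESHOLD_M = 5.0  # hypothetical: obstacle distance that demands action

def apply_brakes() -> None:
    print("braking locally, without waiting on a network round trip")

def edge_process(distance_readings_m: list[float]) -> dict:
    """Act on urgent readings locally; summarize the rest for the Cloud."""
    if min(distance_readings_m) < BRAKE_THRESHOLD_M:
        apply_brakes()  # must happen in milliseconds, so never via the Cloud
    # Only a small aggregate is uploaded for fleet-wide analytics.
    return {"min_m": min(distance_readings_m),
            "mean_m": mean(distance_readings_m),
            "samples": len(distance_readings_m)}

summary = edge_process([42.0, 17.3, 3.9, 28.5])
print("uploaded to the Cloud:", summary)
```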
While much information will still be uploaded and processed through the Cloud, some applications will demand ultra-fast access to data, requiring physical infrastructure that sits closer to the edge rather than where the data is centrally stored. As information is exchanged between local and centralized data center facilities, however, new challenges will emerge, including possible service disruption as well as latency and network reliability issues.
The reality is that to enable the IoE many organizations will deliver specific IoT applications and services from a variety of data centers ranging from smaller in-house networking facilities to large colocation data centers. And this will have implications for the overall levels of resilience and security that will be expected.
Large data centers and colocation facilities have the highest standards for such functions as data backup, failover systems and physical security. Backups are performed regularly and there is ample storage and server redundancy, enhanced by virtualization, in the event of equipment failure. Highly redundant power and cooling systems are de rigueur, and physical security is strictly enforced to ensure no unauthorized access.

Friday, April 21, 2017

TELEHOUSE TUTELAGE: PEERING 101

Understanding Network Interconnectivity

http://www.telehouse.com/solutions/connectivity/peering/

Peering, simply defined, is the direct interconnection of two different networks, allowing them to exchange traffic with one another and enabling each organization to reach the other's customers. Public peering is performed across a shared network and relies on Internet Exchanges (IXs) to function and deliver content across the world. An Internet Exchange is an Ethernet switch, or set of Ethernet switches, in a colocation facility, to which all networks peering in the facility can connect. Using an IX, a network can cost-effectively peer with hundreds of other networks through a single connection, as the back-of-envelope comparison below illustrates.
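A rough way to see the economics is to compare one shared IX port against a separate private cross-connect per peer. The monthly prices in this Python sketch are invented for illustration; real IX and cross-connect fees vary widely.

```python
# Illustrative comparison of public peering (one shared IX port) versus
# private peering (a dedicated cross-connect per peer).
# All prices are invented for illustration only.

IX_PORT_MONTHLY = 1500.0       # assumed fee for a single shared IX port
CROSS_CONNECT_MONTHLY = 300.0  # assumed fee per private cross-connect

def monthly_costs(peers: int) -> tuple[float, float]:
    """Return (public, private) peering cost for reaching `peers` networks."""
    public = IX_PORT_MONTHLY                 # one port reaches every peer
    private = peers * CROSS_CONNECT_MONTHLY  # one dedicated cable per peer
    return public, private

for peers in (3, 10, 100):
    public, private = monthly_costs(peers)
    print(f"{peers:>3} peers: IX ${public:,.0f}/mo vs private ${private:,.0f}/mo")
```

On these assumed numbers, the shared port wins once a network peers with more than a handful of others, which is why dedicated cables are generally reserved for the heaviest traffic relationships described next.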

Private peering within a colocation facility involves two networks putting routers in the same building and running a direct cable between them rather than connecting via the exchange point switch. This is common when the networks are exchanging a large volume of traffic that won’t fit on a shared connection to an exchange point.

Most major population centers have an Internet Exchange. Because of the significant network density available in these key locations, a variety of businesses, including cloud and content providers, financial services companies, global enterprises and public sector agencies, choose to deploy their infrastructure within these facilities. This allows them to leverage the direct interconnection options available by cross-connecting with multiple network providers. Unlike private peering, no new cabling needs to be deployed, although peering arrangements must still be negotiated with each peer.



Thursday, April 13, 2017

TELEHOUSE GLOBAL SPOTLIGHT: SOFTWARE-DEFINED NETWORKING AND THE DATA CENTER

Enhancing Connectivity for the Globalized Economy


As enterprises both large and small become increasingly globalized, expanding their businesses across cities, countries and even continents, their networks must grow with them. Software-Defined Networking addresses the fact that the static architecture of conventional networks has become ill-suited to the computing and storage needs of today’s global data center environments and the organizations they serve.

Software-Defined Networking (SDN) is an emerging architecture that is adaptable, manageable and cost-effective, making it ideal for the dynamic, high-bandwidth nature of today’s applications. This architecture decouples the network control and forwarding functions, enabling the network control to become directly programmable, and the underlying infrastructure to be abstracted for applications and network services. SDN facilitates the deployment of applications that make it easier for a widely-dispersed, global workforce to communicate and collaborate with each other.
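The decoupling of control and forwarding can be illustrated with a small toy model in Python. This models the general idea only; it is not the OpenFlow protocol or any particular vendor's controller API.

```python
# Toy illustration of SDN's separation of planes: switches only match
# and forward; a central controller computes and installs the rules.
# This models the concept, not the OpenFlow API or any real controller.

class Switch:
    """Forwarding plane: a dumb match/action table."""
    def __init__(self, name: str):
        self.name = name
        self.flow_table: dict[str, str] = {}  # destination -> output port

    def forward(self, dst: str) -> str:
        return self.flow_table.get(dst, "punt-to-controller")

class Controller:
    """Control plane: global view, directly programmable policy."""
    def __init__(self, topology: dict[str, dict[str, str]]):
        self.topology = topology  # switch name -> {destination: port}

    def install_flows(self, switch: Switch) -> None:
        switch.flow_table.update(self.topology[switch.name])

sw1 = Switch("sw1")
ctrl = Controller({"sw1": {"10.0.0.2": "port2", "10.0.0.3": "port3"}})
ctrl.install_flows(sw1)            # policy pushed down from the controller
print(sw1.forward("10.0.0.2"))     # -> port2
print(sw1.forward("10.0.0.9"))     # -> punt-to-controller
```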

Some of the key computing trends driving the need for SDN include the rise of cloud services, Big Data, and Bring Your Own Device (BYOD) policies. Moreover, applications that access geographically distributed databases and servers through public and private clouds require extremely flexible traffic management and access to bandwidth on demand, something SDN delivers. SDN restores control of the network to the network administrator, enabling a company to scale its network based on its own requirements rather than on the constraints of existing vendor solutions. It provides more flexibility in configuring traffic flow, better monitoring, and smoother removal of the inefficiencies and bottlenecks that affect performance.



Tuesday, March 28, 2017

TELEHOUSE, THE HUMAN ELEMENT: A VIEW FROM THE BRIDGE

Interview with Akihiko Yamaguchi, EVP and CMO of KDDI America and COO of Telehouse America


Aki Yamaguchi is the Executive Vice President and Chief Marketing Officer of KDDI America and Chief Operating Officer of Telehouse America. We recently had the opportunity to interview Mr. Yamaguchi and discuss his background, the impact of Big Data and the Internet of Things (IoT) on data centers, as well as the current state of the colocation market.

From E-mail to IoT

Interestingly, Mr. Yamaguchi, who has worked with the KDDI Group for over 26 years, did not originally set out to build his career in the technology industry.

“To be perfectly honest, at the beginning of my career I was not at all interested in any of the technical disciplines,” he shared with us. “I studied English literature and was attracted to business as an opportunity to advance my language skills and interact with other professionals from all over the world.”

After joining KDDI, Mr. Yamaguchi quickly developed an affinity for the telecommunications sector and was struck by the nature of its continuously developing innovations.

“The telecom industry has been growing very quickly and things shift rapidly,” he noted. “Technologies get old after six months or so, and I was very attracted to the dynamic changes one sees happening throughout the industry every day.”

Looking back, Mr. Yamaguchi can still recall the initial impact of email as a means for business development and customer relations.

“It’s a funny thing,” he stated, recalling the early days of widespread internet access and email. “When business shifted from simple handwriting to personal computers and e-mail correspondence, I would often call clients immediately after sending an email for fear that it wouldn’t be received. It truly was a dramatic change for many professionals.”


Friday, March 24, 2017

TELEHOUSE GLOBAL SPOTLIGHT: TOKYO, THE HEART OF JAPAN’S DIGITAL AWAKENING

Driven by Global Enterprises, Tokyo Has Become Asia’s Largest Colocation Market


The telecommunications market of Japan is among the largest, most profitable and most advanced in the world. While Japan was initially slow to adopt the internet and broadband access, today the country has more broadband subscribers than all of Europe combined. Driven by demand for high-speed internet and mobility services, the Japanese telecom industry is on track to remain one of the most developed global markets.

The growth of Japan's telecom industry can be attributed to a burgeoning middle class and the increased interest of leading global enterprises in establishing a presence there. Recognizing that enhancing the Information and Communications Technology (ICT) sector improves social and commercial development, the Japanese government has taken active steps to develop the digital economy, including a more liberalized approach to foreign investment and programs to encourage technological innovation.

At the heart of Japan's digital awakening is a growing demand for data center space in the country's major metro areas, especially Tokyo, spurred in part by the need to accommodate leading multinationals' expansion across the island nation. The greater Tokyo metropolitan area, the most populous in the world, has a population of approximately 35 million.



Tuesday, March 21, 2017

TELEHOUSE GREEN: INNOVATION THROUGH COLLABORATION

How the Open Compute Project Is Transforming Data Center Infrastructure and Hardware


As Albert Einstein once stated, “Problems cannot be solved by the same level of thinking that created them.”

In the data center and colocation industry, where the copious energy needed to power critical infrastructure strains both natural resources and the bottom lines of facility owners and operators, a new level of thinking has become an existential requirement. To meet this challenge, the data center community has been forced to shift its longstanding, entrenched perspective on hardware and infrastructure and become more dynamic, inventive and holistic in its approach to design.

Enter the Open Compute Project, inspired by the creativity and collaboration exemplified by open source software. The Open Compute Project officially launched in 2011, when Facebook decided to share the design of what it described as the world's most energy-efficient data center with the public. Soon after, Intel®, Rackspace, Goldman Sachs and Andy Bechtolsheim, the electrical engineer who co-founded Sun Microsystems and later became an early investor in Google, lent their support.

The mission of the Open Compute Project is based on a simple, yet powerful concept. Members of this community believe that openly sharing ideas, specifications and intellectual property is the key to maximizing innovation and reducing complexity in the tech components needed to support the growing demands on compute infrastructure. Today, with hundreds of participants actively collaborating, the Open Compute Project is transforming data center infrastructure and hardware design with a focus on energy and material efficiencies.

The Data Center: A Single, Unified Ecosystem

Traditional data center design often treats components such as the building, servers and software in isolation; the Open Compute Project, by contrast, evaluates the collective influence of all components within the data center environment. Viewing the data center as a single, unified ecosystem leads to optimized energy and material use, as well as reduced environmental impact. Three core aspects of the Open Compute Project's approach to data center infrastructure and hardware are enhanced rack design, localized backup power and evolved machinery.
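A common yardstick for the energy side of this effort is Power Usage Effectiveness (PUE), the ratio of total facility power to the power delivered to IT equipment. The Python sketch below computes it from two meter readings; the numbers are illustrative assumptions, not Open Compute measurements.

```python
# Power Usage Effectiveness: total facility power / IT equipment power.
# A PUE of 1.0 would mean every watt reaches the computing gear; real
# facilities run higher because of cooling and power-distribution losses.
# The readings below are illustrative assumptions.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Return the PUE for a pair of instantaneous power readings."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

print(pue(total_facility_kw=1500.0, it_equipment_kw=1000.0))  # 1.5, typical
print(pue(total_facility_kw=1100.0, it_equipment_kw=1000.0))  # 1.1, lean
```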


Contact Details:
Telehouse America
7 Teleport Drive,
Staten Island,
New York, USA 10311
Phone No: 718–355–2500
Email: gregory.grant@telehouse.com