Friday, September 22, 2017

Backup Disaster Recovery Solutions for Smart Cities


Roughly 2.3 billion connected devices now operate in smart cities like Tokyo, London and New York, a whopping 42 percent increase from last year. With so much depending on those devices, disaster recovery solutions are critical: for smart cities to keep bolstering economic growth and enhancing quality of life, the companies behind their services must have a viable IT disaster recovery plan in place.

The Internet of Things (IoT) reduces energy consumption through applications such as water flow regulation and street light operation. Top disaster recovery service providers predict this footprint will exceed 50 billion connected devices by 2020, creating a tremendous protection challenge. It is imperative that smart cities have an IT disaster recovery plan to guard against the natural and man-made incidents that would deprive them of the accurate data they require to function optimally.

Beyond the outage risks of the power grids they sit on, smart cities are targets for cyber-attacks. Cloud disaster recovery becomes vital to prevent multiple city services from being shut down through a single entry point, an all-too-real possibility that would threaten the safety and health of the public.

Along with 162,000 square feet of colocation space at a dedicated Continuity Recovery site, Telehouse New York Teleport boasts ample offices to accommodate personnel should an adverse event occur.

Friday, September 1, 2017

ALGORITHMS: SCARY SMART AND SOMETIMES JUST PLAIN SCARY

Algorithms, complex mathematical equations designed to solve problems or perform a task, automate much of the technology that makes smart cities smart, from intelligent public transportation and traffic management to smart electrical grids and water usage. Algorithms are also a fundamental tool in transforming Big Data, first into useful analytics, and eventually into action. More on that later.


Data centers and colocation facilities, the pillars of smart cities, are replete with examples of the use of algorithms. Data Center Infrastructure Management (DCIM) tools predict cooling issues based on algorithms built from temperature pattern models. There are load balancing algorithms, which play an important role in distributing network or application traffic across servers, thereby dynamically improving the efficiency of computing resources. And there are smart storage algorithms, which process requests for video and other rich media, and hold the promise of reducing energy use for enterprise-level storage area networks by 20 to 50 percent.
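
As an illustration of the load balancing idea, here is a minimal sketch of a least-connections strategy in Python. The server names, and the strategy's simplicity, are illustrative assumptions rather than a description of any particular product:

```python
# Minimal sketch of a least-connections load balancing strategy.
# Server names and the single metric are illustrative.
from dataclasses import dataclass


@dataclass
class Server:
    name: str
    active_connections: int = 0


def pick_server(pool: list) -> Server:
    """Route the next request to the server with the fewest active connections."""
    target = min(pool, key=lambda s: s.active_connections)
    target.active_connections += 1
    return target


pool = [Server("web-1"), Server("web-2"), Server("web-3")]
for _ in range(5):
    print("routing request to", pick_server(pool).name)
```

Production load balancers weigh in server capacity, health checks and session affinity, but the core decision loop is no more mysterious than this.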

The world’s first recorded algorithm, the eponymously-titled Euclidean algorithm, came to us around 300 B.C. and is still used by computers today. In fact, without algorithms, there would be no computer operating systems, no World Wide Web, and no Google with which to Google “algorithms,” much less the name of that actress who starred in that movie with that guy.

Okay, so now we have your attention.
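
For the curious, Euclid's procedure survives essentially unchanged in modern code; a minimal Python rendering:

```python
# Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b)
# until the remainder is zero; the survivor is the greatest common divisor.
def gcd(a: int, b: int) -> int:
    while b:
        a, b = b, a % b
    return a


print(gcd(1071, 462))  # prints 21
```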

Getting Too Personal

Today, algorithms are increasingly affecting our personal and professional lives in ways that we can’t imagine or might even find unsettling. Consider the algorithm created by the analytics team at the U.S. retailer Target, which could calculate whether a woman is pregnant and even when she is due to give birth.

In a nutshell, Target, like every retailer, stores a history of every item their customers have bought and any demographic information the company has collected from them. Target analyzed this information against historical buying data for all the women who had ever signed up for its baby registries. The analytics team then created an algorithm that identified 25 products — from unscented lotion to supplements such as calcium and zinc to oversized purses large enough to double as a diaper bag — which, when collectively analyzed, assigned each shopper a “pregnancy prediction” score. More importantly, for direct marketing purposes, its algorithm also estimated a woman’s due date, so that Target could send coupons to customers’ homes timed to specific stages of pregnancy.
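
Target has never published its actual model, but a weighted scoring scheme of the kind described can be sketched in a few lines of Python. The weights and the flagging threshold below are invented purely for illustration:

```python
# Hypothetical sketch of a purchase-based "pregnancy prediction" score.
# Weights and threshold are invented; Target's real model was never published.
PREDICTIVE_WEIGHTS = {
    "unscented lotion": 0.30,
    "calcium supplement": 0.25,
    "zinc supplement": 0.20,
    "oversized purse": 0.15,
}


def pregnancy_prediction_score(purchase_history: set) -> float:
    """Sum the weights of predictive products found in a shopper's history."""
    return sum(w for item, w in PREDICTIVE_WEIGHTS.items() if item in purchase_history)


shopper = {"unscented lotion", "calcium supplement", "paper towels"}
score = pregnancy_prediction_score(shopper)
print(f"score: {score:.2f}", "-> flag for campaign" if score > 0.5 else "")
```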

And what could be the harm in that? Pregnancy, birth, an impending bundle of joy? Well, some women, families and, especially, teenagers preferred that their pregnancies remain private. But Target’s predictive algorithm-based marketing hadn’t factored that very human element into its campaign, and trouble ensued.

Every Digital Breath You Take

And then of course there is the U.S. National Security Agency’s XKeyscore program, which was one of the covert projects revealed by Edward Snowden. You may never have heard of XKeyscore, but it definitely has heard of you.

XKeyscore collects every digital breath you’ve ever taken on the Internet, including browsing history, Google searches, and the content of your emails and online chats, and at the tap of a key, can process that data through an algorithm to identify potentially subversive activity. The NSA’s own training materials identified XKeyscore as its “widest reaching” system for developing intelligence from the Internet.

Contact Details:
Telehouse America
7 Teleport Drive,
Staten Island,
New York, USA 10311
Phone No: 718–355–2500
Email: gregory.grant@telehouse.com

Thursday, August 31, 2017

FRAILTY, THY NAME IS DATA! – MASTERING DISASTER RECOVERY


Mastering Disaster

From a busted pipe to a fire, your data is vulnerable during a disaster. This is a serious problem, considering that half of businesses that lose data for 10 days or more end up filing for bankruptcy within 6 months. You can avoid this issue by having a data center disaster recovery plan in place. The right disaster recovery strategy will safeguard your data and get you back up and running quickly.

Run Mock Drills

Your data center disaster recovery strategy must include drills. Mock drills prepare your team for a disaster so they can handle the stress during the real thing. This prevents people from panicking, ensuring that they respond effectively. The more drills you run, the better off you will be: practice turns into habit, and mock drills build the right habits.

Use Advance Planning and a Clear Chain of Communication

Advance planning and communication are also essential for your data center recovery. Start with a clear chain of communication so people know exactly who to contact during a disaster. Run mock drills so everyone can practice communicating, and update the chain when staff changes or new technology emerges.

As for planning, consider what could fail during a disaster and keep spare parts on hand so repairs can begin immediately. If an emergency occurs, your vendors won’t be able to rush out to you, so having those parts available is essential to getting back up and running. In addition, keep up with your maintenance schedule to reduce the chance of failure during a disaster.


Monday, August 28, 2017

WREAKING HAVOC: DDOS ATTACKS ARE GROWING MORE FREQUENT, SOPHISTICATED AND COSTLY


Distributed Denial-of-Service (DDoS) attacks, which do not discriminate, have been used to target financial services, healthcare, technology, media and entertainment, software, gaming and a host of other industry sectors. According to Neustar’s Worldwide DDoS Attacks and Protection Report, 73 percent of organizations have suffered a DDoS attack, and 85 percent of attacked businesses have been victims of multiple assaults. Almost half of all DDoS targets run the risk of losing more than $100,000 per hour, with one-third exposed to potential losses of more than $250,000 per hour.

How do those numbers add up over the duration of an attack? According to research by the Ponemon Institute, the average cost of a DDoS attack last year was $4.7 million. Moreover, more than half of all targets have also suffered a cybersecurity breach while undergoing a DDoS attack.
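
The arithmetic is straightforward; a quick back-of-the-envelope in Python, with the 18-hour duration chosen as an illustrative assumption:

```python
# Back-of-the-envelope arithmetic for the exposure figures above.
# The 18-hour duration is an illustrative assumption.
def ddos_loss(hourly_cost_usd: float, duration_hours: float) -> float:
    return hourly_cost_usd * duration_hours


print(f"${ddos_loss(250_000, 18):,.0f}")  # $4,500,000 over an 18-hour attack
```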


While the number and severity of DDoS attacks have risen every year, this past year has seen the rise of mega attacks targeting major sites. Twelve attacks have been recorded achieving greater than 100 Gbps throughput, of which five exceeded 200 Gbps.

The most notorious and largest of recent attacks were executed in the fall of 2016 by the Mirai botnet, a collection of more than 100,000 compromised Internet-connected video cameras, consumer routers and other devices. Mirai’s victims included the Republic of Liberia’s Internet infrastructure, as well as U.S. DNS service provider Dyn, whose outage rendered several high-profile websites such as Netflix, Spotify and Airbnb inaccessible. When one can’t binge-watch their favorite show, listen to R&B classics, or book an affordable room in Paris or Barcelona, something has to change. More on that later.

Cybercriminals have many DDoS-enabled weapons in their quiver. While the Mirai attack directly targeted Dyn’s DNS service, DNS services can also be interrupted by a method known as spoofing, or cache poisoning, whereby corrupt DNS data is used to divert Internet traffic away from the correct server. In fact, one in five DDoS attacks last year were DNS-based.

DNS servers can also be used to generate DDoS traffic with DNS amplification. The cybercriminal sends a DNS query with a forged IP address. The DNS server responds to the forged IP address belonging to the target of the attack. The result is that small queries trigger large responses that can overwhelm the target of the attack.
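
The danger lies in the size asymmetry between query and response. A quick sketch of the amplification arithmetic, using typical illustrative sizes rather than measurements of any specific attack:

```python
# The bandwidth amplification factor: response size divided by query size.
# Sizes are typical illustrative values, not measurements of a real attack.
def amplification_factor(query_bytes: int, response_bytes: int) -> float:
    return response_bytes / query_bytes


# A ~60-byte query that draws a ~3,000-byte response:
print(f"{amplification_factor(60, 3000):.0f}x amplification")  # 50x
```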

So, how do companies solve the challenge of quickly detecting and mitigating the pernicious threat of DDoS attacks?


Monday, July 24, 2017

NYSCIO 2017: KEY TAKEAWAYS FROM NYSERNET’S 16TH ANNUAL EVENT

NYSERNet’s NYSCIO 2017 conference explores IT trends within higher education institutions


As discussed in our recent blog, The Impact of Technology on Higher Education, technological innovation is making an enormous impact on the higher education system as an increasing number of institutions utilize IoT, Big Data analytics, OTT streaming technologies and advanced online learning platforms to provide students, faculty and administration with an abundance of new opportunities to enhance educational experiences.




Last week, NYSERNet’s 16th annual New York State Chief Information Officer (NYSCIO) conference took place at the Harbor Hotel in Clayton, New York, a region better known as the 1000 Islands. This event brings together Chief Information Officers (CIOs) and senior IT leaders from across New York State’s higher education community, providing a platform for networking, education and business development. I had the pleasure of attending the three-day conference to not only share information about the Telehouse data center services and solutions that benefit the higher education community, but to gain a deeper understanding of the various challenges and obstacles faced by academia’s leaders and decision-makers.

While in attendance, I had the opportunity to hear about higher education’s most pressing topics, including IT compliance issues, digital transformation, Big Data analytics and the integration of IT and academics. During the event, I was also introduced to the findings of the recent Fall 2016 Campus Computing Survey during a session led by Campus Computing Project Founding Director Casey Green, who explored the various IT priorities of New York universities and colleges.

The results of the 2016 National Survey of eLearning and Information Technology in U.S. Higher Education indicated that there are five major priorities for today’s campus administrators: hiring and retaining qualified IT staff; assisting faculty with the instructional integration of IT; upgrading and enhancing network and data security; providing adequate user support services; and leveraging IT resources to support student success. In fact, each one of these concerns was ranked a top priority by more than 75 percent of survey respondents.


Friday, July 21, 2017

GLOBAL SPOTLIGHT: INTERNET AND DATA CENTER GROWTH IN RUSSIA

Multinationals Seeking a Commercial Presence in Russia Push the Market Forward



With nearly 74 million users, Russia is Europe’s largest internet market. Given that figure, the metrics surrounding Russia’s data center industry seem surprisingly modest. The country’s commercial data center market reached just $303 million in 2014, though it has been growing at approximately 25 percent per year over the last five years, according to Direct INFO, a research consultancy.

In fact, as recently as eight years ago there were only half a dozen Tier I to Tier II commercial data centers in the entire country and these were largely operated by systems integrators. At the time, Russia’s technology talent pool lacked the necessary skillsets to build and operate modern data centers.

Today, however, Russia has no fewer than 180 data centers, most of which are in Moscow. Sixteen of the 20 largest data centers in the country operate in the capital, each containing more than 1,000 racks with an average total capacity of 12 MW. Over the next several years, that count is anticipated to grow due to a confluence of factors, and not just in Moscow.

Government Regulations and Global Business Drive Growth

The data center colocation market, in particular, is being stimulated by government legislation, passed in September 2015, which forbids the storage of Russian citizens’ personal data on servers located abroad. Multinational and Russian financial institutions, as well as insurance and investment companies, are also facing new, more stringent regulations on international activity, which will increase the demand for premium data center services.

The other main drivers of the Russian colocation sector include a steady rise in demand for new white space, a growing interest among Russian enterprises in outsourced data center strategies, and an increasing number of international service providers and enterprises looking to establish a commercial presence in Russia.

With the development of enterprise branch networks, it also becomes desirable for companies to centralize the processing and storage of data used by complex business applications, ERP and CRM systems, for example. Hence, commercial data centers will increasingly be used to centralize the IT infrastructures of global companies. Moreover, the high reliability of commercial data centers will allow multinational firms to ensure the continuity of their business.

On the Edge and in the Cloud

The owners of large-scale web projects, including search engines, web portals and social networks that generate a significant amount of traffic and number of users, also seek to locate their equipment closer to the end-user, or on the edge of the network, to reduce the costs of data transfer. These web-scale players are specifically interested in regional data centers.


THE “NEW” NEW YORK

How New York Is Evolving Into a World-Class Smart City


Simon Sylvester-Chaudhuri is a Managing Partner at Global Fortunes Group, spearheading various products and programs that drive urban innovation. As an advocate for technological innovation around the world, Simon is passionate about smart city development on a global scale and has worked with multiple world-class cities throughout Europe and the Middle East. We recently had the opportunity to interview Simon to discuss the current state of New York as a smart city as well as the policies that are driving the technological advancements that will define the “new” New York.

Coming to America

While the concept of smart cities has been in practice throughout Europe for nearly a decade, this trend has only taken hold in the United States over the past two to three years across a limited number of major metro areas, including New York, Chicago, Atlanta and San Francisco. In New York, the conversion into a technologically advanced smart city is predominantly driven by government programs and citizen engagement.

“One of the key drivers of innovation that I’ve experienced in New York is the willingness of policy-makers, privately-held enterprises and general citizens to work together to create a smarter and more advanced city,” shared Simon. “We’re not only focusing on the technology aspect, but also creating new ways to engage citizens and organizations with innovation labs and government programs. These provide an element of inclusiveness that is unique to New York, enabling intelligent discussion and action.”

As a testament to that commitment, the Mayor’s Office of Technology and Innovation (MOTI) has laid the groundwork for continued innovation by providing the necessary resources for a variety of projects, including the conversion of the historic Brooklyn Navy Yard into a model of modern, urban technological development. This industrial park also serves as home to the New Lab, one of the world’s leading technology hubs.

“In the push toward technological innovation, major universities such as Cornell, Columbia, NYU and CUNY are also getting involved in a big way,” Simon added. “Universities throughout this region are driving multiple initiatives to collect data that will help develop programs for enhanced urban and scientific progress as well as sustainability. One such program is the Urban Future Lab at the NYU Tandon School of Engineering, which hosts several programs focused on education, policy and market solutions to solve the challenge of sustainability in smart cities.”


Friday, June 30, 2017

Telehouse Green: How Green Is My Cloud?


UNDERSTANDING THE ENERGY EFFICIENCY OF CLOUD-BASED COMPUTING
Forrester estimates that worldwide spending on Public Cloud computing services will reach $160 billion in 2020, reflecting a 22 percent annual growth rate over five years. And it’s not just Public Cloud that is experiencing a spike, but Private and Hybrid Cloud usage too.
Among enterprises with 1,000 or more employees, Private Cloud adoption increased from 63 percent to 77 percent, and Hybrid Cloud rose from 58 percent to 71 percent, from 2015 to 2016, according to RightScale’s 2016 State of the Cloud survey. Enterprises that use the Cloud are, on average, leveraging three Public Clouds and three Private Clouds apiece.
Businesses are increasingly opting to switch from internal resources to cloud-based computing to enjoy benefits such as faster scalability of capacity, pay-as-you-go pricing, and access to cloud-based applications and services without the need to purchase and manage expensive on-premises infrastructure.
But whether you’re considering a Public, Private or Hybrid Cloud configuration, as-a-service computing offers another distinct advantage over on-premises alternatives: It’s comparatively greener. A study by Accenture found that for large enterprise firms, Cloud adoption can cut energy use and carbon emissions by 30 to 60 percent in comparison to on-premises IT infrastructures. And for mid-sized firms using the Cloud, carbon emissions and energy consumption can be reduced by as much as 60 to 90 percent.
Let’s examine why.
Green That Is Virtually Self-Evident
Some of the reasons why cloud-based infrastructure is greener than on-premises equipment are…well…virtually self-evident.
Virtualization, the definitive technology at play, enables a single physical server to run multiple operating system images simultaneously. Through consolidation, server virtualization reduces the total physical server footprint. Fewer servers mean less power consumed and a reduced carbon footprint. And when less equipment is required to run workloads, data center space shrinks; with less physical equipment plugged in, a facility consumes less electricity.
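
The arithmetic of consolidation is easy to sketch. In the toy Python estimate below, the server count, consolidation ratio and wattages are all illustrative assumptions:

```python
# Toy consolidation estimate: N physical servers virtualized onto
# ceil(N / ratio) hosts. Counts, ratio and wattages are assumptions.
import math


def rack_power_kw(servers: int, ratio: int, watts_per_host: float) -> float:
    hosts = math.ceil(servers / ratio)
    return hosts * watts_per_host / 1000


before = rack_power_kw(200, 1, 400)   # 200 standalone hosts -> 80.0 kW
after = rack_power_kw(200, 10, 450)   # 20 virtualization hosts -> 9.0 kW
print(f"{before:.1f} kW before, {after:.1f} kW after consolidation")
```

Even allowing for beefier virtualization hosts, the idle-power floor shrinks roughly in proportion to the consolidation ratio.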
It’s interesting to note that virtualization is nothing new. In fact, IBM pioneered the concept in the 1960s, but its potential has only been fully realized with the advent of modern data center and server technologies.

Telehouse for Technophiles: Can Your Data Center Survive the Next Big Earthquake?

A Look at Disaster Preparedness in Los Angeles-Based Data Centers
The ground shook violently, car alarms shrieked and retail boutique windows shattered across the busy sidewalks of Hollywood Boulevard. On January 17, 1994, a 6.7-magnitude earthquake struck just 20 miles west of Los Angeles, producing the strongest seismic disturbance ever recorded in a North American city. This was the costliest natural disaster to strike the United States at the time, causing billions of dollars of structural damage and economic loss, and severely damaging hundreds of buildings throughout the Los Angeles metro area, including skyscrapers, hospitals, stadiums and apartment complexes.
Positioned along the San Andreas Fault, California experiences 10,000 earthquakes on average every year, according to the United States Geological Survey. While most are mild enough to go undetected by the general public, roughly 15 to 20 of these earthquakes reach a magnitude greater than 4.0, thereby exposing vulnerable structures to significant damage.
In California, earthquakes aren’t a seasonal threat like hurricanes, but can strike at any time without warning. Experts predict there is a 67 percent chance of an earthquake with a magnitude of 6.7 or greater striking Los Angeles within the next 30 years.
Disaster Recovery Planning is the Key to Business Continuity
Faced with an earthquake, a company’s information may not be irretrievably lost, but without access to critical data like customer and financial records, its business operations likely won’t be able to withstand the event. An earthquake of high magnitude can easily disable an enterprise data center or colocation site through damage to the structure of the building, equipment, its ability to access power, or the many connections established within the facility.
Data center operators, particularly those in California, must have an adequate disaster recovery plan to mitigate the threat of downtime during an earthquake. Disaster preparedness in seismic-sensitive regions requires a combination of virtual and physical safeguards to ensure the facility’s continued operations. Secondary, offsite backups are a common way for data centers to prepare for disaster. By replicating data in the cloud, data center operators eliminate a single point of failure, ensuring that mission-critical information remains fully accessible.
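
The replication principle can be sketched in a few lines of Python: copy a backup to several independent targets and verify each copy's integrity, so no single site failure leaves data unrecoverable. The paths and the "db-snapshot.bak" file below are hypothetical:

```python
# Sketch of offsite replication with integrity checks. Paths are
# illustrative and "db-snapshot.bak" is a hypothetical backup file.
import hashlib
import shutil
from pathlib import Path


def sha256(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()


def replicate(backup: Path, sites: list) -> None:
    source_digest = sha256(backup)
    for site in sites:
        site.mkdir(parents=True, exist_ok=True)
        copy = site / backup.name
        shutil.copy2(backup, copy)
        # Verify the copy before trusting it as a recovery point.
        assert sha256(copy) == source_digest, f"corrupt copy at {site}"


replicate(Path("db-snapshot.bak"), [Path("site-a"), Path("site-b")])
```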

Tuesday, June 20, 2017

Disaster Recovery Solutions and Colocation Facilities Are Critical to Protecting the Data of Smart Cities


According to Gartner, the technology research and advisory firm, this year, an estimated 2.3 billion connected things will be used in smart cities, which includes major metro areas such as New York, London and Tokyo. That projection represents a 42 percent increase in the number of connected devices since 2016.
Rapid urbanization has mandated the need for smart city solutions. Experts worldwide point out that smart cities will be the future enablers in accelerating economic growth and improving the quality of life of metropolitan citizens. As we will learn later in this blog, because colocation facilities and data centers provide the backbone of smart city infrastructure, having viable disaster recovery solutions in place will become essential not only to smart, public services but for maintaining public health and safety.
Smart cities rely on interconnected devices to streamline and improve city services based on rich, real-time data. These systems combine hardware, software and geospatial analytics to enhance municipal services and improve an area’s livability. IoT-enabled sensors, for example, can reduce the energy expended in street lights or regulate the flow of water to better preserve resources.
But with the global IoT footprint expected to surpass 50 billion connected devices by 2020, smart cities will need to strengthen disaster recovery methods for both unexpected natural and man-made incidents that would adversely impact their ability to rely on accurate data to properly function. While we don’t like to think about it, some of the world’s smart cities are in low-lying coastal areas that are prone to weather-related emergencies, such as flooding, or are situated on grids whose history suggests the possibility of future outages. Also, unfortunately, there is the ever-present threat of cyberattack and the destruction or sabotage of physical infrastructure.
To look at but one real-world scenario: if hackers targeted a smart city’s Supervisory Control and Data Acquisition (SCADA) systems, which some cyber defense experts claim are susceptible to intrusions due to poor security protocols, they could potentially shut down multiple city services from a single entry point and threaten public health and safety.
For this reason, and as more and more cities around the world adopt smart initiatives, it becomes mission-critical to make data security a priority. Just as public utilities such as power, gas and water are physically protected and secure, smart city planners need to secure data by implementing failover and backup in all systems and networks, extending to the data centers that form the key infrastructure for providing IoT-enabled services. Disaster recovery preparedness, which starts with accessible data backup, is the foundation of business and smart city continuity.

Monday, May 29, 2017

Telehouse Global: The Impact of Technology on Higher Education

BUILDING A SAFER, SMARTER AND MORE CONNECTED WORLD, ONE CAMPUS AT A TIME

Higher education directly affects social mobility and economic development on a global scale. For this reason, innovative technologies enabled by advanced colocation services are being developed to make higher education more attainable, affordable and effective for students from a wide range of socio-economic backgrounds and geographic regions. Leveraging the power of the Internet of Things (IoT), Big Data analytics, OTT streaming technologies and advanced online learning platforms, institutions are providing access to an array of new opportunities for high-quality educational experiences.
Technological innovation is fundamentally changing how universities interact with students, enabling global institutions from both the public and private education sectors to adopt various trends such as adaptive learning technologies that monitor student progress, mobile applications that enable students to remotely access course material, and next-gen learning management systems that deliver a holistic view of educational development.
Once the province of for-profit institutions, online classes are now offered at top tier universities such as Harvard, Yale and Brown, as well as mid-level and community colleges worldwide. The growing popularity of online education is due in part to the ability to provide course material in a way that is not only flexible, but immersive, utilizing mobile technology, web-based video communications, and access to a seemingly endless supply of online content and resources.
These same technologies have also granted globally dispersed universities and research organizations the ability to partner and collaborate on many influential research projects. At the University of Wisconsin, for example, agriculture students and faculty work alongside various Chinese research universities and organizations to analyze environmental factors affecting the milk yield of cows and develop solutions for the advancement of the dairy industry.
Technology is also helping universities make their communities safer, smarter and more efficient. The use of IoT devices and smart technology is pervasive throughout university campuses, ranging from automated emergency alerts and outdoor Wi-Fi access points to smart laundry facilities and responsive HVAC systems. Data collection has also opened the door to the use of advanced analytics that help university administrators better understand and satisfy the needs of their student body, with technologies such as smart map apps that help students navigate the campus and IP-enabled cameras for enhanced security.

Monday, May 22, 2017

TELEHOUSE TUTELAGE: EDGE COMPUTING IN THE ERA OF IOE

Micro Data Centers and Colocation Providers Will Enable a Future of Ubiquitous Connectivity

Driverless cars, drone-powered product delivery, and remotely monitored, environmentally controlled doghouses are but a few examples of the wondrous Internet of Everything. For the uninitiated, the Internet of Everything, or IoE, builds on the foundation of the Internet of Things (IoT) by adding network intelligence that allows convergence, orchestration and visibility across previously disparate systems. As we will learn further on, both micro data centers and colocation providers will play an integral role in enabling a future of ubiquitous connectivity.

One can envision the IoT as the equivalent of a railroad, including the tracks and connections, whereas the IoE is the railway line, as well as the connected trains, rail weather monitoring systems and sensors, departures and arrivals board, and even staff and customers. The Internet of Everything connects all these separate “things” into one cohesive whole, enabling these IoT-enabled devices and connected humans to communicate and share data with each other in real time.
Metaphors aside, delivering the on-demand connectivity, compute, networking and storage necessary to enable the IoE will be challenging. Research firm Gartner forecasts that 8.4 billion connected things will be in use worldwide by the end of the year, up 31 percent from 2016, reaching 20.4 billion by 2020.
Considered a direct outcome of the growing interest in IoT and IoE, edge computing brings bandwidth-intensive content and latency-sensitive applications closer to the user or data source. With edge computing, IT professionals can provide data processing power at the edge of a network instead of maintaining it in a Cloud. Because of the need for high-speed computing — which, for example, would be critical for a driverless car to be able to stop for traffic signs and avoid fender benders — edge computing is considered more reliable than Cloud computing.
While much information will still be uploaded and processed through the Cloud, some applications will demand ultra-fast access to data, requiring the use of physical infrastructure that is closer to the edge versus where the data is centrally stored. However, as information is exchanged between more local and centralized data center facilities, one must consider the challenges that will emerge as a consequence. These include possible service disruption as well as latency and network reliability issues.
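
The latency argument comes down to physics: light in fiber covers roughly 200 km per millisecond, so distance alone sets a floor on round-trip time. A toy Python comparison, in which the distances and the processing allowance are illustrative assumptions:

```python
# Light in fiber covers roughly 200 km per millisecond, so distance alone
# sets a latency floor. Distances and processing time are assumptions.
def round_trip_ms(distance_km: float, processing_ms: float = 5.0) -> float:
    return 2 * distance_km / 200 + processing_ms


print(f"edge node, 10 km away:   {round_trip_ms(10):.1f} ms")
print(f"central cloud, 2,000 km: {round_trip_ms(2000):.1f} ms")
```

For a vehicle traveling at highway speed, those extra milliseconds translate directly into stopping distance, which is why latency-sensitive workloads gravitate toward the edge.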
The reality is that to enable the IoE many organizations will deliver specific IoT applications and services from a variety of data centers ranging from smaller in-house networking facilities to large colocation data centers. And this will have implications for the overall levels of resilience and security that will be expected.
Large data centers and colocation facilities have the highest standards for such functions as data backup, failover systems and physical security. Backups are performed regularly and there is ample storage and server redundancy, enhanced by virtualization, in the event of equipment failure. Highly redundant power and cooling systems are de rigueur, and physical security is strictly enforced to prevent unauthorized access.

Tuesday, May 16, 2017

TELEHOUSE GLOBAL SPOTLIGHT: OTT VIEWERSHIP IS FAST BECOMING OVER THE TOP

Colocation Provides a Solution to OTT Performance


Over-the-Top (OTT), in telecom parlance, refers to an app or service that delivers content such as streaming video and audio over the internet rather than traditional cable or satellite distribution. According to the 2017 OTT Video Services Study conducted by Level 3 Communications, viewership of OTT video services, including Netflix, Hulu and Amazon Prime, will overtake traditional broadcast TV within the next five years. Meanwhile, Juniper Research predicts that the global OTT market will increase to $18 billion in 2019, up from $8.5 billion just three years ago.

Additionally, it’s worthy of note that the audience for OTT content is growing not only in total viewership and revenue, but geographically. Last year, Netflix tripled its global reach by expanding into an additional 130 countries as the video streaming service took its most aggressive step yet in its plans for international growth.

The reason for the surge in OTT viewership lies in immediate gratification: People want what they want when they want it. OTT allows viewers to consume content whenever and wherever they desire on their preferred device. Particularly for millennials, appointment TV is now widely considered a legacy entertainment model.

Supporting the increasing volume of streaming video requires solutions to the hosting, delivery, bandwidth and performance challenges that all too frequently frustrate the Quality-of-Service and experience of online video viewers. Whether at the source or along the last mile, insufficient bandwidth creates interruptions that result in dreaded buffering pauses. Content providers address bandwidth challenges by compressing data and bringing content closer to users, placing the data on edge servers in strategically located data centers and colocation facilities around the world. However, in order for OTT players to successfully reach their audience, it’s critical to colocate within data centers capable of providing low-latency connectivity to end users throughout their target geographic regions.
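
At its simplest, buffering is an inequality: playback stalls whenever sustained throughput falls below the stream's bitrate. A minimal Python sketch, using typical published bitrates as assumptions:

```python
# Playback stalls whenever sustained throughput drops below the stream's
# bitrate. The bitrates are typical published values, used as assumptions.
def will_buffer(stream_mbps: float, throughput_mbps: float) -> bool:
    return throughput_mbps < stream_mbps


for label, bitrate in [("HD @ 5 Mbps", 5.0), ("4K @ 25 Mbps", 25.0)]:
    print(label, "stalls on a 15 Mbps link:", will_buffer(bitrate, 15.0))
```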



Wednesday, May 10, 2017

SMART CITIES: ENABLING A BETTER QUALITY OF LIFE

Defining the Phenomenon Sweeping the World’s Major Metropolitan Areas


Toward the end of the last decade, the number of humans inhabiting the Earth crossed the seven billion mark, with the majority of people living in metropolitan areas. According to the World Health Organization, urban residents account for 54 percent of the total global population – a number that is expected to grow nearly two percent each year until 2020. As a result, it’s become critical to establish greener and more efficient technology in major metropolises.

Once the realm of science-fiction, today, “smart cities” are being established around the world, transforming how we live through the use of innovative technology and analytics. A smart city is defined by the integration of information and communication technology (ICT) and IoT devices to manage a variety of public assets, while acquiring critical data on an ongoing basis to improve the lives of its citizens.

According to research firm Frost & Sullivan, a smart city comprises eight elements: smart governance and education, smart healthcare, smart building, smart mobility, smart infrastructure, smart technology, smart energy and smart citizen. Cities that successfully integrate at least five of these eight markers earn the distinction of being a smart city. In addition, Frost & Sullivan estimates a combined market potential of $1.5 trillion globally across these various smart city categories.
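
That five-of-eight criterion translates directly into code; a minimal Python sketch, with the element labels shortened for readability:

```python
# Frost & Sullivan's five-of-eight smart city criterion, translated
# directly. Element labels are shortened here for readability.
SMART_ELEMENTS = {
    "governance and education", "healthcare", "building", "mobility",
    "infrastructure", "technology", "energy", "citizen",
}


def qualifies_as_smart_city(integrated: set) -> bool:
    return len(integrated & SMART_ELEMENTS) >= 5


print(qualifies_as_smart_city(
    {"mobility", "energy", "healthcare", "infrastructure", "technology"}
))  # True
```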

The core focus of the world’s smart cities is to enable a better quality of life for all residents and visitors. Utilizing smart technology, officials gain enhanced visibility into the inner workings of the city on a granular level, enabling them to identify the services citizens need. For example, in New York City, the LinkNYC program is in the process of transforming 7,500 former payphones into digital hubs, providing the public with free Wi-Fi, charging ports, and access to directions and city services. Boston has implemented solar-powered benches that not only allow users to charge their mobile devices, but also send environmental data to government officials via advanced sensors in hopes of improving the city’s livability.



Friday, April 28, 2017

Strong Economy and Strict Privacy Laws Make Frankfurt a Global Epicenter for Colocation

After the Second World War, Frankfurt am Main was rebuilt and soon emerged as a leading financial and commercial hub in West Germany. The city experienced strong economic development due to its central position on the Main River and its expansion into neighboring domestic markets.


Fast-forward to the present day and Frankfurt, now a bustling international metropolis and the financial capital of Europe, is still experiencing strong growth, particularly on the digital front. Germany is now one of the four leading colocation markets in Europe, and the largest concentration of its data centers can be found around Frankfurt, where the majority of internet traffic from Germany and many other countries is routed.

The Place to Be in Germany

Two decades ago, Frankfurt had a reputation for being a somewhat lackluster metropolis. But now the city — referred to as “Mainhattan” for its downtown skyscrapers — is on a cultural, technological and economic upswing, and rapidly becoming a top destination for colocation providers.

Frankfurt also plays host to a thriving startup community and the second-largest internet exchange in Europe, DE-CIX, with over 500 ISPs and carriers. Startup growth around the Frankfurt region is occurring at a rate of 22 percent annually, while the rest of Germany is hovering around 13 percent. Meanwhile, its financial technology industry is second only to the UK’s in terms of overall investment, and German fintech business is expected to top $2 billion by 2020.


Friday, April 21, 2017

TELEHOUSE TUTELAGE: PEERING 101

Understanding Network Interconnectivity


Peering, simply defined, is the interconnection of two different networks, allowing them to directly exchange traffic between one another, and organizations to reach each other’s customers. Public peering is performed across a shared network and relies upon Internet Exchanges (IXs) to function and deliver content across the world. An Internet Exchange is an Ethernet switch or set of Ethernet switches in a colocation facility, to which all networks peering in the facility can connect. Using an IX, a network can cost-effectively peer with hundreds of other networks through a single connection.

Private peering within a colocation facility involves two networks putting routers in the same building and running a direct cable between them rather than connecting via the exchange point switch. This is common when the networks are exchanging a large volume of traffic that won’t fit on a shared connection to an exchange point.

Most major population centers have an Internet Exchange. Because of the significant network density available in these key locations, a variety of businesses, including cloud and content providers, financial services companies, global enterprises and public sector agencies, choose to deploy their infrastructure within these facilities. This allows them to leverage the direct interconnection options available by cross-connecting with multiple network providers. Public peering arrangements still need to be negotiated with each peer, but unlike private peering, no new cabling needs to be deployed.
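
The economics behind that choice can be sketched with a toy per-peer cost comparison in Python; all prices and counts below are invented assumptions for illustration:

```python
# Toy per-peer cost comparison behind the "single connection, hundreds
# of peers" point. All prices and peer counts are invented assumptions.
def monthly_cost_per_peer(connection_cost_usd: float, reachable_peers: int) -> float:
    return connection_cost_usd / reachable_peers


ix_port = monthly_cost_per_peer(1_500, 300)     # one IX port reaching ~300 networks
cross_connect = monthly_cost_per_peer(500, 1)   # one private cross-connect, one peer
print(f"IX peering: ${ix_port:.2f}/peer; private peering: ${cross_connect:.2f}/peer")
```

Private peering only pays for itself when two networks exchange enough traffic to justify a dedicated link, which is exactly the rule of thumb described above.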



Thursday, April 13, 2017

TELEHOUSE GLOBAL SPOTLIGHT: SOFTWARE-DEFINED NETWORKING AND THE DATA CENTER

Enhancing Connectivity for the Globalized Economy


As enterprises both large and small become increasingly globalized, expanding their businesses across cities, countries and even continents, their networks must grow with them. Software-Defined Networking addresses the fact that the static architecture of conventional networks has become ill-suited to the computing and storage needs of today’s global data center environments and the organizations they serve.

Software-Defined Networking (SDN) is an emerging architecture that is adaptable, manageable and cost-effective, making it ideal for the dynamic, high-bandwidth nature of today’s applications. This architecture decouples the network control and forwarding functions, enabling the network control to become directly programmable, and the underlying infrastructure to be abstracted for applications and network services. SDN facilitates the deployment of applications that make it easier for a widely-dispersed, global workforce to communicate and collaborate with each other.

Some of the key computing trends driving the need for SDN include the rise of cloud services, Big Data and the Bring Your Own Device (BYOD) trend. Moreover, applications that commonly access geographically distributed databases and servers through public and private clouds require extremely flexible traffic management and access to bandwidth on demand – something that SDN delivers. SDN restores control of the network to the network administrator, enabling a company to scale its network based on its own considerations rather than on existing vendor solutions. It provides more flexibility in configuring network traffic flow, better monitoring, and smoother removal of the inefficiencies and bottlenecks that would affect performance.
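
The decoupling of control and forwarding can be illustrated with a toy Python sketch: a controller installs match-action rules, and the "switch" merely looks them up. The field names and actions below are simplified stand-ins, not any real SDN protocol or API:

```python
# Toy illustration of the control/forwarding split. Field names and
# actions are simplified stand-ins, not a real SDN protocol or API.
FLOW_TABLE = []


def controller_install_rule(match: dict, action: str) -> None:
    """The programmable control plane pushes a rule down to the data plane."""
    FLOW_TABLE.append((match, action))


def switch_forward(packet: dict) -> str:
    """The data plane applies the first matching rule; it holds no local logic."""
    for match, action in FLOW_TABLE:
        if all(packet.get(k) == v for k, v in match.items()):
            return action
    return "punt_to_controller"


controller_install_rule({"dst_port": 443}, "forward:uplink-1")
print(switch_forward({"dst_port": 443}))  # forward:uplink-1
print(switch_forward({"dst_port": 22}))   # punt_to_controller
```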



Wednesday, April 12, 2017

TELEHOUSE FOR TECHNOPHILES: THE LINK BETWEEN CLOUD ADOPTION AND CONNECTIVITY

How Surging Cloud Use is Making Connectivity a Differentiator for Data Centers


It’s safe to say that the global tech forecast is cloudy, and getting cloudier.

Consider this: According to the Cisco Global Cloud Index, cloud traffic will account for more than 92 percent of total global data center traffic by 2020. In addition, cloud data center traffic for consumer and business applications will grow at a Compound Annual Growth Rate (CAGR) of 30 percent over the next three years, and 68 percent of cloud workloads will be processed by public cloud data centers – a 49 percent increase from 2015.
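
Those CAGR figures compound annually; a quick Python check shows what 30 percent sustained growth implies for traffic volume over the three-year horizon cited above:

```python
# CAGR compounds annually: what 30 percent sustained growth implies
# for traffic volume over three years.
def compound(base: float, cagr: float, years: int) -> float:
    return base * (1 + cagr) ** years


print(f"{compound(1.0, 0.30, 3):.2f}x traffic after 3 years")  # 2.20x
```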

This migration to cloud computing can largely be attributed to performance-driven enterprises’ growing use of cloud-based applications. In one recent study conducted by Skyhigh Networks that surveyed various IT decision-makers, 79 percent of respondents claimed that they receive regular requests from end-users each month to buy more cloud applications. Among these applications, communication and collaboration via video, file and content sharing, and social media topped the list of the most frequently requested capabilities.



Tuesday, March 28, 2017

TELEHOUSE, THE HUMAN ELEMENT: A VIEW FROM THE BRIDGE

Interview with Akihiko Yamaguchi, EVP and CMO of KDDI America and COO of Telehouse America


Aki Yamaguchi is the Executive Vice President and Chief Marketing Officer of KDDI America and Chief Operating Officer of Telehouse America. We recently had the opportunity to interview Mr. Yamaguchi and discuss his background, the impact of Big Data and the Internet of Things (IoT) on data centers, as well as the current state of the colocation market.

From E-mail to IoT

Interestingly enough, Mr. Yamaguchi, who’s worked with the KDDI Group for over 26 years, did not originally set out to start his professional career in the technology industry.

“To be perfectly honest, at the beginning of my career I was not at all interested in any of the technical disciplines,” he shared with us. “I studied English literature and was attracted to business as an opportunity to advance my language skills and interact with other professionals from all over the world.”

After joining KDDI, Mr. Yamaguchi quickly developed an affinity for the telecommunications sector and was struck by the nature of its continuously developing innovations.

“The telecom industry has been growing very quickly and things shift rapidly,” he noted. “Technologies get old after six months or so, and I was very attracted to the dynamic changes one sees happening throughout the industry every day.”

Looking back, Mr. Yamaguchi can still recall the initial impact of email as a means for business development and customer relations.

“It’s a funny thing,” he stated, recalling the early days of widespread internet access and email. “When business shifted from simple handwriting to personal computers and e-mail correspondence, I would often call clients immediately after sending an email for fear that it wouldn’t be received. It truly was a dramatic change for many professionals.”


Friday, March 24, 2017

TELEHOUSE GLOBAL SPOTLIGHT: TOKYO, THE HEART OF JAPAN’S DIGITAL AWAKENING

Driven by Global Enterprises, Tokyo Has Become Asia’s Largest Colocation Market


The telecommunications market of Japan is among the largest, most profitable and most advanced in the world. While Japan was initially slow to introduce the internet and broadband access, today the country has more broadband subscribers than all of Europe combined. In fact, driven by the demand of high-speed internet and mobility services, the Japanese telecom industry is on track to become one of the most developed global markets.

The growth of Japan’s telecom industry can be attributed to the burgeoning middle class and the increased interest of leading global enterprises in establishing a presence there. Recognizing the importance of enhancing the Information and Communications Technology (ICT) sector across the country to improve social and commercial development, the Japanese government has taken active steps to develop its digital economy, including a more liberalized approach to foreign investments and programs to encourage technological innovation.

At the heart of Japan’s digital awakening is a growing demand for data center space in the country’s major metro areas, especially Tokyo, spurred in part by the need to accommodate the expansion of leading multinationals’ business across the island nation. The greater Tokyo metropolitan area, the most populous in the world, has a population of approximately 35 million.



Tuesday, March 21, 2017

TELEHOUSE GREEN: INNOVATION THROUGH COLLABORATION

How the Open Compute Project Is Transforming Data Center Infrastructure and Hardware


As Albert Einstein once stated, “Problems cannot be solved by the same level of thinking that created them.”

In the data center and colocation industry, where copious amounts of energy used to power critical infrastructure cause significant strain on natural resources and the bottom line of facility owners and operators, the need for a new level of thinking has become an existential requirement. To meet this challenge, the data center community has been forced to shift its longstanding and entrenched perspective on hardware and infrastructure to become more dynamic, inventive and holistic in its approach to design.

Enter the Open Compute Project, which was inspired by the creativity and collaboration exemplified by open source software. The Open Compute Project officially launched in 2011, when Facebook decided to share its design for the world’s most energy-efficient data center with the public. Soon after, Intel®, Rackspace, Goldman Sachs and Andy Bechtolsheim, the electrical engineer who co-founded Sun Microsystems and later became an early investor in Google, lent their support.

The mission of the Open Compute Project is based on a simple, yet powerful concept. Members of this community believe that openly sharing ideas, specifications and intellectual property is the key to maximizing innovation and reducing complexity in the tech components needed to support the growing demands on compute infrastructure. Today, with hundreds of participants actively collaborating, the Open Compute Project is transforming data center infrastructure and hardware design with a focus on energy and material efficiencies.

The Data Center: A Single, Ubiquitous Ecosystem

While traditional data center design often occurs in isolated components such as the building, servers and software, the Open Compute Project, by contrast, evaluates the collective influence of all components within the data center environment. This unique approach to viewing the data center as a single, ubiquitous ecosystem leads to optimized energy and material use, as well as reduced environmental impact. Three core aspects of the Open Compute Project’s approach to data center infrastructure and hardware include enhanced rack design, localized back-up power and evolved machinery.



Wednesday, March 15, 2017

TELEHOUSE PREPARES FOR TUESDAY WINTER STORM, STELLA 3/14/2017

Please be advised: The Telehouse Facilities Department has been actively tracking Nor’easter Stella and is taking precautionary measures to mitigate the risks of the storm, which is predicted to hit New York and the New England area Monday evening into Tuesday evening.

Staff at Staten Island’s 7 Teleport and Chelsea’s 85 10th Avenue facilities have been notified of the potential blizzard conditions and will maintain consistent operation for our customers throughout the coming days.

At this time, Telehouse has taken proactive measures and arranged additional coverage by engineers and operations technicians in the event that storm conditions hit the tri-state area.

Telehouse has already conducted full inspections of all critical equipment, including UPS, generator, chiller and switchgear systems, confirming normal working condition before the weekend. We have also prepared spare parts kits for emergency equipment as a proactive measure. In addition, our fueling companies and vendors have been notified to place their teams on standby.



Wednesday, February 22, 2017

TELEHOUSE FOR TECHNOPHILES: THE BRAIN IN THE MACHINE

How Data Centers Are Using Deep Learning

Machine and deep learning emerged from Artificial Intelligence (AI), the theory and development of computer systems that can perform tasks normally requiring human intelligence. Deep learning mimics the activity of layers of neurons in the neocortex, the area of the brain where thinking occurs, and deep learning software can be programmed to recognize patterns in digital representations of sounds, images and other data. In fact, machine intelligence is transforming the future of everything from communications to healthcare, and from manufacturing and transportation to advanced robotics. Writers and filmmakers such as Arthur C. Clarke and Steven Spielberg have foretold a brave new world where AI will one day influence every waking aspect of our personal and professional lives.

Science-fiction aside, machine learning is already well-established in our everyday world, from your faithful companion Siri to facial recognition programs to language translation. But it can also help to tackle some of the world’s most challenging industrial problems, such as rampant energy consumption that adversely impacts the environment. Large-scale commercial systems, including high performance data centers, consume a lot of energy, and while much has been done to mitigate energy usage in enterprise and colocation facilities, deep learning can do much more to manage the world’s increasing need for high performance computing power.
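
As a toy illustration of that energy-management idea, the Python sketch below fits a tiny model that predicts a facility's Power Usage Effectiveness (PUE) from sensor readings. The data points and the single linear "neuron" are invented stand-ins for the deep networks a production system would train on thousands of sensors:

```python
# Toy energy-management sketch: fit a tiny model predicting PUE from
# sensor readings. Data and the single linear "neuron" are invented
# stand-ins for the deep networks a production system would use.
samples = [  # (outside_temp_C, IT_load_fraction) -> observed PUE
    ((15.0, 0.6), 1.35), ((25.0, 0.6), 1.52),
    ((15.0, 0.9), 1.28), ((25.0, 0.9), 1.44),
]

w_temp, w_load, bias = 0.0, 0.0, 1.0
for _ in range(5000):  # plain stochastic gradient descent on squared error
    for (temp, load), pue in samples:
        x = (temp - 20.0) / 5.0  # center and scale the temperature feature
        err = (w_temp * x + w_load * load + bias) - pue
        w_temp -= 0.01 * err * x
        w_load -= 0.01 * err * load
        bias -= 0.01 * err

# Query the fitted model at settings the facility might consider.
x = (20.0 - 20.0) / 5.0
print(f"predicted PUE at 20C, 75% load: {w_temp * x + w_load * 0.75 + bias:.2f}")
```

A real deployment would then search such a model for the control settings that minimize predicted PUE, closing the loop between sensing and cooling decisions.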

