Sprint to Shut Down WiMAX Service ~Nov 6, 2015

Overview:

At long last, Sprint has confirmed it will be turning off its WiMAX service on or around November 6th, 2015. That’s a little over one year for owners of WiMAX mobile devices (including laptops with WiMAX cards/dongles) to switch to LTE. WiMAX mobile devices will still work in 3G mode after that, but one doesn’t buy a 4G device or interface to use it in 3G mode, especially after a few years of ownership.

The time frame for the shutdown was originally reported by a blog on Android Central this past weekend, which cited an internal Sprint memo detailing the company’s network plans. Yesterday, Sprint spokeswoman Adrienne Norton confirmed the date to FierceWireless.

In April, Sprint said in a Securities and Exchange Commission filing that it would “cease using WiMAX technology  by the end of 2015.” As part of that effort, Sprint said it identified  approximately 6,000 “redundant sites that we expect to decommission and  terminate the underlying leases.”

This author long ago forecast the demise of mobile WiMAX, because no other major wireless telcos or mobile device makers (except Samsung) were supporting it. In a September 2006 blog post, we asked, “Will the real ‘Mobile WiMAX’ please stand up!”

What’s Cooking Now in Sprint’s Wireless Broadband Kitchen:

Sprint says it already has built its 4G LTE network to serve more than 255 million people, with more on the way, making it easy for customers to take advantage of all the great features on smartphones.

In addition to Sprint’s all-new 3G network and its 4G LTE network, Sprint is deploying Sprint Spark, a technology designed to greatly improve the performance of video and other bandwidth-intensive applications, including new generations of online gaming, virtual reality and advanced cloud services. It enables stutter-free video chat on-the-go and mobile gaming that leaves lag behind. Sprint Spark is an enhanced LTE service that’s built for data and designed to deliver average wireless speeds of 6-15Mbps and peak wireless speeds of 50-60Mbps today on capable devices, with increasing speed potential over time. Sprint plans to reach 100 million Americans by year-end with the service.

Beginning Oct. 10, the Sprint Business Share Plan will again double the data, now delivering 240GB to 800GB of data to business customers with 50 to 100 lines.  The latest promotion, running through Oct. 31, 2014, provides businesses double the data for $50 to $150 less per month than a similar promotion from AT&T.

The Road Ahead will be Very Bumpy:

A recent SEC regulatory filing confirms mass layoffs at Sprint.  With its latest round of layoffs and cost-cutting, it will be difficult for the smaller company to expand and upgrade its LTE infrastructure and continue on the M2M/IoT path it had set for itself over four years ago.

Google’s Actions Hurt Broadband Rollout, Particularly in Rural America

Lack of relevant content, affordability and digital literacy are the three dominant barriers to the adoption of broadband, according to the World Bank Broadband Strategies Toolkit. The Google Apps Partner Edition™ platform addresses these barriers, as it is a relatively low-cost email, communications, productivity and device-management platform. It has become a platform trusted by tens of millions to manage the cyber aspects of their businesses and their lives, including this publication, which has used variations of it since late 2006.

A recent decision by Google that unilaterally changes the way it deals with ISP partnerships jeopardizes that trust, puts up new barriers to broadband and has some suggesting that Google’s actions are at the heart of the issues brought forward in the Senate Judiciary Committee net neutrality hearings. Even assuming the ISPs can continue with some variation of the existing program, the approximately tenfold cost increase makes cable programming price hikes trivial by comparison.

A Brilliant Partnership Strategy

An ISP brochure featuring the benefits of Google Apps Partner Edition.

To help drive adoption of its platform, Google engaged ISPs and Communications Service Providers, many of which are non-profit, member-owned entities, to be value-added resellers of their Google Apps platform. These operators transitioned their customers from self-hosted or third-party email solutions to the Google Apps platform.

This approach of working with local partners allowed Google to focus on creating and keeping the platform relevant, as indicated in this 2007 Google blog post by Google product manager Hunter Middleton:

“From the beginning, we envisioned making Google Apps available to any organization that might want to offer this innovative set of services to its employees, customers, students, members, or any other associates of the organization. Today, we’re excited to take another step in that direction by releasing a version of Google Apps specifically designed for ISPs, portals, and other service providers, whether you have a few thousand subscribers or over a million. This new version, which we’re calling the Partner Edition, makes it easy for large and small service providers to offer your subscribers the latest versions of powerful tools, like Gmail, Google Calendar, and Google Docs & Spreadsheets, without having to worry about hosting, updating, or maintaining any of the services yourself. All you have to do is point and click in the easy admin control panel and figure out what branding you’d like to layer on top of the products in order to create a customized look and feel. You can quit spending your resources and time on applications like webmail — and leave the work to our busy bees at the Googleplex.”

And the strategy worked, as hundreds of ISPs, representing what some estimate to be approximately one million subscribers, signed up. By being able to private label the service, the ISPs could offer their customers state-of-the-art, ad-free email, contacts, chat, calendars, online documents, photos and more at an affordable price without having to maintain a costly infrastructure. BEVCOMM’s CEO Bill Eckles explained how it made a difference to his company and community:

 “BEVCOMM has been using their [Google’s] platform through a 3rd party integrator for a couple of years.  We switched from managing our email platform in-house to Google’s platform because of their reputation for being incredibly reliable.  As a very small company we simply didn’t have the resources to manage a platform ourselves with the reliability people demand from email.”

With Google focusing on software, the ISPs could focus on educating customers on how to use the many features inherent in the Google Apps products. These customers included not only residential subscribers, but also schools and small businesses that appreciated the assistance that only an operator with a local presence can provide. These ISPs essentially serve as the local, outsourced IT staff, freeing up the small businesses’ resources to focus on their own services and products.

In an email exchange, Kurt Gruendling, VP of Marketing for WCVT, a family-owned rural Vermont communications company, explained how he and his colleague taught classes to more than 1,000 customers:

“I’m a big advocate for the platform and I think the Google Apps platform is very powerful and have spent a considerable amount of time teaching “Google School” classes to our customers many of whom would have never used Google Apps.  We have taught over 50 on site classes and webinars throughout our territory over the past 18 months with more than 1,000 customers attending and leading them to the value of the core services in the platform that go way beyond mail.”

Kurt Gruendling teaching rural customers how to use Google Apps.

In addition to its investment in marketing and training, the ISP pays Google for the ongoing cost to maintain the service. The costs are variable, which is a benefit to the ISP, as it doesn’t have to make large upfront investments in servers and software. This allows the ISP to focus on providing higher-speed and more reliable bandwidth to its customers by bringing fiber deeper into the last-mile network, which is important for the cloud-based Google Apps platform. Ironically, many of these rural operators deployed Fiber to the Home (FTTH) before Google’s move into that business in certain urban markets, such as Kansas City, Austin and Provo.

For those operators that were too small to deal directly with Google, at least two Google Apps Partner Edition™ integrators emerged to help: Ikano and NeoNova, a subsidiary of the National Rural Telecommunications Cooperative. These aggregators not only helped the operators, but also got the word out about the Google Apps Partner Edition™, aggressively marketing it and successfully signing up ISPs throughout the country, such as the aforementioned WCVT.

10X Increase in Cost – If You Can Get It

Things were going swimmingly for the operators and their customers. Then, in early 2014, operators found out that Google plans to significantly alter its relationship with ISPs in 2015. The surprise is causing operators to scramble to come up with alternative solutions, and the solutions aren’t going to be simple or inexpensive.

One person commenting on a Google forum suggested the following alternatives, summarized below:

  1. Audit our email accounts to reduce the number as much as possible. Migrate the accounts to Google Apps for Business, which would be about 10 times the cost [from $0.35 to approx. $3-$4 per month that this operator is currently paying Google] per email account. Pass this cost on to consumers and prepare for a backlash.
  2. Migrate to another solution

Based on comments from operators both in private exchanges with this author and on the aforementioned Google Forum, it is unclear whether Google Apps for Business will even be available to all ISPs.

The increased cost cited above is actually much higher on a per-household basis, as the costs are charged per email account; the cost to the ISP for a typical residential broadband account with 5 included email accounts would jump to $16.50 per month.
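
A rough back-of-the-envelope check of those numbers (a sketch: the $3.30 per-account price below is an assumption, a midpoint of the quoted $3–$4 range chosen because it reproduces the $16.50 per-household figure):

```python
# Rough cost comparison based on the prices cited in the forum comment above.
# The $3.30 figure is an assumed midpoint of the quoted $3-$4 per-account range.
partner_edition_per_account = 0.35    # $/month, Partner Edition rate this operator pays today
apps_for_business_per_account = 3.30  # $/month, assumed Google Apps for Business rate
accounts_per_household = 5            # typical residential account with 5 included email boxes

old_household_cost = partner_edition_per_account * accounts_per_household
new_household_cost = apps_for_business_per_account * accounts_per_household

print(f"Per household: ${old_household_cost:.2f} -> ${new_household_cost:.2f} per month")
print(f"Increase: {new_household_cost / old_household_cost:.1f}x")
# Per household: $1.75 -> $16.50 per month
# Increase: 9.4x
```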

The other alternative is to switch providers. There are rumblings that some providers are looking at alternatives, such as Atmail (which interestingly just opened a U.S. office), [added 10/16/14] Hyperoffice, Microsoft Office 365, [Added 10/17/14] OX, Zimbra or Zoho. Still, there are concerns about the migration and, as WCVT’s Gruendling points out, it isn’t clear whether all of a customer’s “paid-for” content will successfully transition, as there isn’t a one-for-one replacement for Google’s excellent product.

“There is a date when Google will delete end-user data, including documents, pictures, videos and paid-for content.”

An ISP prominently promoting Google Apps on its web site.

In addition to the higher direct ongoing costs that, in the end, will be passed on to the consumer, there will also be the costs associated with making the transition. One forum commenter suggested it could take one to two staff members focused on the transition. This is a real opportunity cost for operators with small staffs, and it means they have fewer resources for helping customers and building their broadband networks.

Bill Eckles, CEO of MN-based rural telecommunications provider BEVCOMM, said that experience suggests the transition will be challenging:

“Now that Google has decided they don’t want to maintain this program we are going to be forced to move all of our customers. The markets BEVCOMM serves are generally more rural and less affluent than [those served by] larger companies.  Every time we switch email platforms it is a major undertaking trying to support our customers through the move.  The last time we switched platforms 3,000 email customers opted to switch to another option not offered by BEVCOMM.”

It will be a hassle for the consumers, as it will mean reconfiguration of mail clients on PCs, phones and tablets. Special care will be necessary for the ISP’s business customers who are dealing with confidential information and regulatory compliance issues (e.g. HIPAA).
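
As a minimal sketch of what that reconfiguration typically means for an end user (the Gmail values below are Google’s published IMAP/SMTP settings; the replacement provider’s hostnames are hypothetical placeholders):

```python
# Illustrative sketch of reconfiguring a mail client after a provider change.
# Gmail values are Google's published IMAP/SMTP settings; the replacement
# provider's hostnames are hypothetical placeholders.
old_settings = {
    "incoming (IMAP)": ("imap.gmail.com", 993, "SSL/TLS"),
    "outgoing (SMTP)": ("smtp.gmail.com", 587, "STARTTLS"),
}
new_settings = {
    "incoming (IMAP)": ("imap.example-isp.net", 993, "SSL/TLS"),   # hypothetical
    "outgoing (SMTP)": ("smtp.example-isp.net", 587, "STARTTLS"),  # hypothetical
}

for role, (host, port, security) in new_settings.items():
    old_host = old_settings[role][0]
    print(f"{role}: change server from {old_host} to {host} (port {port}, {security})")
```

Multiply that change by every PC, phone and tablet mail client in a household, and the support burden on a small ISP becomes clear.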

And since this transition includes documents, photos and paid-for content, it will be even more complex than the transition that the ISPs made to move their existing email systems to Google’s platforms. And many of these providers are still recovering from that move, which, for some, occurred less than a year ago.

Google – Rural America Is Calling

In a letter to Senator Leahy, WCVT characterized this as a rural/urban and big/small issue, and said the conversion will create a multi-million dollar support burden that will affect the operators, as well as Google.

Bill Eckles of BEVCOMM reinforced WCVT’s comments when he stated in an email:

“Switching email platforms is a major disruption for our customers, who really at the end of the day are Google’s customers. As far as I know Google didn’t even bother asking for input from any of the companies who are offering their email.  This experience has really shown Google doesn’t seem to care about rural consumers.”

A WCVT service vehicle in rural Vermont about to cover a historic covered bridge.

One of the big frustrations for the ISPs is that they haven’t been given a reason why the program, which these ISPs want to keep, is being unilaterally changed by Google. They have made suggestions for compromise solutions, but have been unable to discuss them with any of the Google staff responsible for the offering.

This action also seemingly runs counter to Google’s initiatives to make broadband ubiquitous, as its ISP partners are deploying Fiber to the Home in rural markets that Google will not be able to reach with its own last-mile fiber network.

WCVT, in its letter to Senator Leahy, provides a voice to its rural ISP brethren with its desire to meet with Google.

“We are requesting a meeting with Google decision-makers with authority and vision to establish a reasonable course of action. We need the chance, on behalf of our consumers, to sit down and discuss the impacts of this decision and to seek to work with Google to find alternative solutions rather than just having the plug pulled on us.”

These providers want to work with Google and one service provider holds out hope that those who came to rely on Google to provide critical content for their broadband networks will be able to come up with a solution that works for all parties:

“There has to be a better solution. We are committed to working with Google and hope that they don’t turn their back on rural America.”

[Editor’s Note: At the time of publication, Google has not yet provided an official response to this article, explaining why they changed the Google Apps Partner Edition™ program.]

[Added 03/25/15 & 10/16/14] Links to exclusive ViodiTV interviews with operators discussing the impact of Google’s actions on their operations and their customers:

  • Donnie Miller explains how Pioneer Telephone was blindsided by Google.
  • Catherine Moyer, GM of Pioneer Communications, discusses the impact of Google’s actions and their use of Google Apps, from the 2014 NTCA Fall Conference.
  • Kurt Gruendling, WCVT VP of Marketing and Business Development, discusses the impact of Google’s actions.

Rural America Needs Advanced Services & Competition – Part 1

The above title is by no means an original thought; this belief has been documented in policy statements and legislation at the Federal and individual state level many times over the years.  Technology and other social factors have resulted in many changes to telecommunications policy for the common good of our country.  A significant number of the social goals and policy revisions were initiated to ensure that people, communities and economic development in Rural America were not left behind and that all Americans would realize the benefits of advancements in technology.  Today, our government, via laws, rules and policies, has stipulated that open & fair competition is right for all of America.  But before we discuss the current landscape, a brief review of how we got here would be helpful.

Bell System 1921 logo. Image courtesy of Wikimedia Commons.

Alexander Graham Bell patented the telephone in 1876; American Telephone & Telegraph (AT&T) and the associated Bell companies were formed and became known as the Bell System.  Generally speaking, the local Bell companies provided the local connections, mostly in larger cities, and AT&T provided the long-distance lines.  In addition to the Bell System, various non-Bell telephone companies (independent or cooperative companies) were created to offer local service where the Bell companies were not offering service; these mostly rural areas included farms, ranches, very small towns and even resort communities.

Because there was some concern that it was not an attractive economic decision to build facilities to serve rural areas, the government stepped in to offer financial assistance to ensure quality service was offered in rural America.  The Rural Electrification Administration (REA), now the Rural Utilities Service (RUS), was created to assist the interests of consumers and communities in areas where adequate service was not provided.  Because this requirement for financial assistance in rural areas continues today, the RUS remains a vital part of economic development in rural America.

Lots of wires: Felicity Street telephone poles, 1909. Image courtesy of Wikimedia Commons.

One of the earliest involvements of the government in the provision of telephone service was a determination that the American common good would be best served by determining that the telephone service marketplace should not be open to all competitors in all areas.  Telephone service at that time was provided by above ground wire and poles; policy makers were concerned about safety, economic and environmental impacts of numerous above ground facilities in urban areas and very little service in less populated areas.

This concern was highlighted by the strong belief that many companies would flock to more densely populated areas, creating havoc for consumers; but few, if any, companies would want to offer service in less populated, more costly rural areas.

For these and other reasons, it was determined that the common good would be best served by replacing open “competition” for the provision of telephone service with “regulation” and by establishing local monopoly franchised service areas.  One company was given the right to be the only company in a designated franchised area to offer telephone service, in exchange for operating under the rules and regulations (including the rates for services) of a regulatory organization: a federal agency for services between the states, and state agencies operating within each state’s jurisdiction.

This “regulated one-company monopoly” model appeared to satisfy consumers for a number of years; but advancements in technology (i.e., new products and services) were extremely slow.  Only the limited products and services offered by the company providing telephone service in that area were available to those consumers.                        

Another major development in government involvement was a policy that became known as “Universal Service,” which was an understanding that the common good for all America was best served when more telephones were on the network.  The idea: every new telephone line added to the network increased the value of every other telephone line on the network (the Network Effect).  This proposition resulted in revised rate structures, revenue sharing between companies and subsidies between services.  It led the way to subsidies between toll & local service, business & residential services and urban & rural services.  These defined subsidies, along with government financial assistance, fostered expanded economic growth and consumer satisfaction in rural areas throughout America.  Telephone service penetration grew substantially throughout the country.
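
As a simple illustration of the Network Effect, the number of possible point-to-point connections among n telephone lines grows roughly with the square of n:

```python
# Illustration of the Network Effect: each new line can reach every existing line,
# so the number of possible point-to-point connections grows roughly as n^2 / 2.
def possible_connections(n: int) -> int:
    return n * (n - 1) // 2

for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} lines -> {possible_connections(n):>12,} possible connections")
# 10 lines -> 45; 100 -> 4,950; 1,000 -> 499,500; 10,000 -> 49,995,000
```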

Throughout the earliest stages of the development of telephone service in this country, policy makers continued to believe that a regulated monopoly with no or limited competition was beneficial for the American consumers.  However, some consumers and other groups began to believe that this type of structure or model was stifling the development of new products and services.  These groups started to expound on the idea that, as long as a company was “guaranteed” all customers in a given area, that company would do very little to satisfy customer demand in that area.  It was clear that consumers wanted new products and services…they wanted choice and competitors wanted to enter the market.  These underlying beliefs brought about a national interest in revisiting the real benefits of a regulated monopoly for telephone service.

The Carterfone adaptor made land-wireless communications possible. Image courtesy of Wikimedia Commons.

It began in the consumer product arena.  The regulated companies had stipulated that only their company-provided equipment could be connected to the public switched network.  However, many new products (such as the Hush-a-Phone and the Carterfone) were being introduced into the market and consumer demand appeared to be growing.

The regulated companies argued that the newer products would harm the public network…the use of one of these products by one customer could have an adverse impact on the service of another customer.  After many court and regulatory battles, “customer provided equipment” was permitted, connection rules (including jacks) were established and telephone service went on with little or no problems.  The fear that competition would denigrate the network was unfounded and consumers began to enjoy a plethora of new end-user products.

The interests of all American consumers started to become the primary measuring stick utilized by policy makers & regulators when discussing issues and making decisions affecting the provision of telecommunications.

The next area ripe for the introduction of competition was the long distance segment of the telecommunications marketplace; but it proved to be much more complex, requiring more time to implement.  It started slowly, in the specialized services arena.  Some new companies (such as MCI, SBS, etc.) were offering private services to companies with large communications needs.

These companies wanted to offer any spare capacity to the general public.  The regulated monopolies objected; they believed that the public offering of this type of communications service was exclusively their business and that these upstart companies should cease offering these competitive services.  Again, numerous court and regulatory battles ensued and the policy makers wrestled with the balance of competition versus regulation.

It was clear that the true benefits (economic and quality) of technological advancements could only be realized if these new market entrants with alternative products and services were permitted to participate in the marketplace….for all consumers…in all areas.

One particular court battle was a very long antitrust action against AT&T and the Bell companies.  After extensive arguments, the parties agreed to an unprecedented settlement of monumental proportions for the telecommunications world.  It was announced on January 8, 1982 and scheduled to become effective January 1, 1984.  It would take almost two years to carry out the stipulated agreement, which was outlined in a “Plan of Reorganization” covering the corporate separation, distribution of assets, definition of service areas, interconnection arrangements, assignment of personnel, establishment of contracts, etc.

AT&T splits into long distance and the seven “Baby Bells.” Image courtesy of Cellstream.

In summary, it separated this enormous American corporate giant along the lines of what were perceived as competitive activities (AT&T long distance) and activities that, at that time, were not considered to be competitive (the local Bell operating companies).  AT&T, with a few subsidiaries such as Bell Labs, Western Electric, etc. (which were more aligned with competitive activity), went in one direction.  And the twenty-two local Bell Operating Companies were separated and reorganized into seven Regional Bell Companies providing local telephone exchange service in specific areas.

AT&T was immediately thrown into a competitive world, having to formally “interconnect” with its earlier corporate subsidiaries and others.  The Bell Operating companies would continue their local regulated monopoly services with all the requirements (Carrier of Last Resort, etc.) associated with that position.  But the handwriting was on the wall…..the word of the day was competition and even the individual Regional Bell Companies soon looked at the other regions (and everyone else in the marketplace) as competitors.

Bottom line: the telecommunications world was turned upside down.  Local Access and Transport Areas (LATAs) were designed…“Access Charges” was added to the industry glossary of terms…Organizations were created, such as NECA, ECSA, etc., to assist in the management of interconnecting carriers.  The complexity of this event was compounded by the number of other participating carriers & companies, such as the 1,000+ non-Bell independent and cooperative exchange companies, the new long-distance carriers, wireless providers and the equipment manufacturers.

This far-reaching action was another step in what was an obvious movement to a competitive operating environment for the entire telecommunications landscape.  Now that the Bell companies could control their own destiny, they began to venture into other areas and test their freedoms; i.e., they could not offer competitive services in their own territory, but they could offer competitive services outside their individual region’s territory.  Wireless or cellular service was of particular interest and the Bell companies were successful in grasping a major foothold in that arena.

[Click here to read part 2 of Mr. South’s article which discusses the post 1984 world and its implications, particularly on rural America.]

 Author:

Gene South

Gene R. South Sr. is a telecommunications and broadband professional with 45 years of experience, including positions as EVP of Panhandle Telephone Cooperative in Guymon, OK; CEO / GM of Lakedale Communications in Annandale, MN; and currently V.P. & Director of Governmental Affairs for Lake Communications in Two Harbors, MN.  Mr. South served as Chairman of the Board of USTA, RTFC and MART; he also has held Board memberships with OPASTCO and MTA.  In addition, he has testified before Congress and state legislatures.

Rural America Needs Advanced Services & Competition – Part 2

[Editor’s note: Mr. South’s first article provided a brief history of AT&T as a regulated monopoly and the forces that drove the 1984 break-up of “Ma Bell.” Part 2 examines the aftermath of the break-up, particularly its impact on telecommunication services to rural areas.]

Click here to read Rural America Needs Advanced Services & Competition – Part 1

AT&T was immediately thrown into a competitive world, having to formally “interconnect” with its previous corporate subsidiaries and others.  The Bell Operating companies would continue their local regulated monopoly services with all the requirements (Carrier of Last Resort, etc.) associated with that position.  But the handwriting was on the wall…..the word of the day was competition and even the individual Regional Bell Companies soon looked at the other regions (and everyone else in the marketplace) as competitors.

Bottom line: the telecommunications world was turned upside down.  Local Access and Transport Areas (LATAs) were designed…“Access Charges” was added to the industry glossary of terms…Organizations were created, such as NECA, ECSA, etc., to assist in the management of interconnecting carriers.  The complexity of this event was compounded by the number of other participating carriers & companies, such as the 1,000+ non-Bell independent and cooperative exchange companies, the new long-distance carriers, wireless providers and the equipment manufacturers.

This far-reaching action was another step in what was an obvious movement to a competitive operating environment for the entire telecommunications landscape.  Now that the Bell companies could control their own destiny, they began to venture into other areas and test their freedoms; i.e., they could not offer competitive services in their territory; but they could offer competitive services outside their individual region territory.  Wireless or cellular service was of particular interest and the Baby Bell companies were successful in grasping a major foothold in that arena.

What was a closed marketplace with a limited number of participants was changed into a semi-open market with many providers (telcos, long distance companies, CATV, wireless, internet providers, satellite, etc.) and you could not tell a player without a scorecard…and even then it was tricky.  The convergence of services over the facilities provided by some companies compounded the complexity; Digital Subscriber Line (DSL) internet access service over the voice grade copper facilities of telcos; Voice Over Internet Protocol (VOIP) provided by internet service providers, etc.

It was becoming abundantly clear that there would be some winners and some losers.  What happened next was a significant number of mergers and acquisitions…each an attempt to gain a stronger position in the market by increasing footprint and enhancing product line offerings…Even the very large Regional Bell companies were not excluded from consolidation.

The marketplace was partly competitive and partly regulated….The lines of demarcation were very fuzzy.  To say it was chaotic would be an understatement.  Regulators were constantly changing rules; putting out fires with little long-term direction for planning purposes.  It was evident that something had to be done to protect the American consumers and ensure that all consumers would realize the numerous benefits from this technological explosion.

Congress was compelled to step in to attempt to crystallize the telecommunications landscape for everyone: regulators, companies and consumers.  The effort would prove to be herculean, consuming lengthy discussions, hearings, comments, arguments and positions from all interested parties.  After various draft bills, Congress produced the Telecommunications Act of 1996, which was signed into law by the President.  The stated objective of the law was:

“To promote competition and reduce regulation in order to secure lower prices and higher quality services for American telecommunications consumers and encourage the rapid deployment of new telecommunications technologies.”

The message was crystal clear: advanced services were critical to the economic growth of America, and competition was the vehicle to deliver those benefits.  Where before the signals were cloudy and piecemeal, America was now focused on a direction that offered the greatest benefit to all consumers.  This very significant congressional action sent the message that all future decisions would be measured against what is best for all American consumers.

This national debate went far beyond just Plain Old Telephone Service (POTS).  Historical discussions had dealt with, “Who was going to provide POTS to a certain community?” Now it is, “What provider can offer me all my telecommunications, Internet & broadband services…today?” This new debate further continues with, “What provider can provide the required services to assist my community with education, health care, security, etc.?”

It will not only be based on who the provider is (telcos, ISPs, CATV operators, satellite providers, private companies, etc.), how it is financed (privately or publicly funded), or who manages the operation; it will be decided on the ability to offer the most advanced services at the best prices in the timeliest manner to serve the consumer and the community.

If one company is not in a position to offer satisfactory responses to these issues for the community and its consumers, then these services will be provided by an organization(s) that steps forward and is ready, willing and capable of the task.

Economic development and consumer interests are the prime movers in these current debates.

The entire industry came under a magnifying glass: externally, by media interests, Congress, regulators, consumer groups and policy makers; internally, industry players studied the market for more self-serving reasons.

Reports indicate that approximately 100 million Americans do not have broadband in their homes.  Internationally, America has fallen behind other countries in the deployment of broadband services.  Domestically, consumers continue to demand advanced services and faster speeds; educators want better service (especially in rural areas); health care providers indicate that enhanced services could improve health care (especially in rural areas).  These types of reports are getting significant media attention, and many policy makers continue to express concern.

A June 2013 snapshot of wireline broadband availability of at least 3 Mb/s, from the US National Broadband Map.

Because Congress, the FCC, the NTIA and state agencies began to place a focus on telecommunications and advanced services, various activities were initiated to investigate and analyze the current state of affairs.  Various studies were undertaken….from a National Broadband Mapping project… to a study of where we are today and what is needed for the future.

In 2009, Congress charged the FCC with developing a National Broadband Plan to ensure every American has access to broadband capability.  The FCC conducted a hearing in November 2009 to discuss specifically identified “barriers” that exist in formulating a new national broadband policy plan.  One major barrier was the Universal Service Fund.  The FCC Task Force believed that…

“the fund should also be used to help subsidize the cost of deploying broadband in rural areas.”

A second barrier that was identified by the FCC Task Force was….

“the fact that broadband service providers tend to favor higher-income regions in more populated areas over low-income areas.  The data suggests that many low-income people in these parts of the country are offered only one broadband service option. The data also suggests that these consumers who have only one option tend to pay higher prices for service.

What this means is that lower-income people, who have less disposable income, are often the ones forced to pay higher prices, while people who have more money pay lower prices for service.

Deployments in rural areas are often affected by the high cost of building infrastructure and providing service. The task force noted that “middle mile” costs are almost three times higher than general network operations costs. This high cost is often a serious barrier to rural broadband deployments, the group said.”

The FCC Task Force conducted an extensive analysis and investigation into what would be required to implement a national broadband policy that would provide high-speed internet access to every American.

The results of the FCC efforts were documented in a comprehensive report unveiled on March 16, 2010 entitled:

“Connecting America: The National Broadband Plan”

The Plan stipulated that:

Government can influence the broadband ecosystem in four ways:

  1. Design policies to ensure robust competition and, as a result, maximize consumer welfare, innovation and investment.
  2. Ensure efficient allocation and management of assets government controls or influences, such as spectrum, poles, and rights-of-way, to encourage network upgrades and competitive entry.
  3. Reform current universal service mechanisms to support deployment of broadband and voice in high-cost areas; and ensure that low-income Americans can afford broadband; and in addition, support efforts to boost adoption and utilization.
  4. Reform laws, policies, standards and incentives to maximize the benefits of broadband in sectors government influences significantly, such as public education, health care and government operations.

The plan also recommended that the country adopt the following six Goals:

  1. At least 100 million U.S. homes should have affordable access to actual download speeds of at least 100 megabits per second and actual upload speeds of at least 50 megabits per second by the year 2020.
  2. The United States should lead the world in mobile innovation, with the fastest and most extensive wireless networks of any nation.
  3. Every American should have affordable access to robust broadband service, and the means and skills to subscribe if they so choose.
  4. Every American community should have affordable access to at least one gigabit per second broadband service to anchor institutions such as schools, hospitals, and government buildings.
  5. To ensure the safety of the American people, every first responder should have access to a nationwide, wireless, interoperable broadband public safety network.
  6. To ensure that America leads in the clean energy economy, every American should be able to use broadband to track and manage their real-time energy consumption.

The release of the National Broadband Plan (NBP) received significant media attention and great anticipation from the entire telecommunications, Internet & broadband segments of the marketplace.  Market participants reviewed all their plans and strategies to measure any impacts on their operations.  Existing broadband providers studied their markets to make investment decisions on the most attractive locations to allocate resources; i.e., where they need to move quickly and where they could delay deployment.  These decisions could be based on a variety of factors such as density, cost to install facilities, current competitors in the area and where it was believed they had a sense of control over that market area.

New entrants in the market conducted similar analyses, but they were starting from a position of limited information; they did have the broadband mapping information, but lacked consumer demand and cost data.  They did believe, however, that time was critical…“first in the market,” and so on.

Some encouraging news was that necessary financing capital was available.  The President had made universal broadband access a key goal for America.  Economic stimulus money was available in the form of grants and loans to approved providers.  Numerous applications were prepared and submitted for approval by existing providers and new entrants.

The other good news for the American consumer was that individual communities, in the form of local municipal or county organizations, became well aware of the importance of advanced services to their constituencies and economic growth.  In the past, when consumers complained about the lack of available services in the area, local government officials believed that their hands were tied.  Now, they saw what was taking place in other parts of the country and around the world and said, “Why not us and why not here?”

This community awakening was contagious and many community activists and organizers (including the general public, businesses, schools, and medical institutions) joined this very active movement.  The battle cry was, “What can we do to secure advanced services for our community?”  They did not want their taxpaying consumers to become “second class citizens,” and in certain situations the communities had run out of patience with being last in line for advanced services from competitors providing service in the area.  Municipal and county officials believed that they were doing their job, and this is one of the reasons why their constituencies trusted them with the responsibility to protect their interests.

The municipalities and / or counties established organizations, reviewed their existing situations, analyzed alternatives, met with constituencies, sought voter approval (if warranted), developed a strategy, prepared documentation, filed paperwork, sought financial assistance (grants or loans from federal government), established contracts with consultants and construction companies….and scheduled installation.

Of course, this activity met with some opposition from competitors in the area, who believed that they had the right to that area.  The answer to that issue is simple: if the perceived competitors had been providing an acceptable level of advanced services in the area, it would not have been necessary for the municipality or county to take that action.  The fact is that a lot of existing companies have assumed “ownership” of the area and believed that these consumers were obligated to wait until the competitor was ready to upgrade facilities in the area to provide advanced or broadband service.

Well, contrary to their opinion, today’s consumers just do not want to wait indefinitely…and educators, health care administrators, business operators, police forces and economic development councils do not want to wait at all…especially in Rural America!

Gene South

Gene R. South Sr. is a telecommunications and broadband professional with 45 years of experience, including positions as EVP of Panhandle Telephone Cooperative in Guymon, OK; CEO / GM of Lakedale Communications in Annandale, MN; and currently V.P. & Director of Governmental Affairs for Lake Communications in Two Harbors, MN.*  Mr. South served as Chairman of the Board of USTA, RTFC and MART; he also has held Board memberships with OPASTCO and MTA.  In addition, he has testified before Congress and state legislatures.

*Lake Communications, a private company, is building and operating Lake Connections for Lake County. Lake Connections is a local fiber-optic broadband provider owned by Lake County and formed to bring High-Speed Internet, Digital TV, and Voice services to Lake County and Eastern St. Louis County in northeastern Minnesota starting in 2014. 

 

 

All-IP Network Transition Plan at FCC's Jan 30th Open Commission Meeting

Introduction:

During his January 8th speech at the Computer History Museum (CHM), FCC Chairman Tom Wheeler told the CHM audience that the U.S. was in a transition to a “4th Network Revolution” that would be led by a transition to an “all-IP” network.  The 4th Network is actually a multi-faceted revolution, based on IP packet communications (for voice, data and video) replacing digital circuit switching and analog transmission.  Communications protocols are moving from circuit-switched Time-Division Multiplexing (TDM) to IP packet switching.  At the same time, 3G and 4G wireless access networks are increasingly prevalent, empowering consumers to connect at the place and time of their choosing.

Wheeler said, “The transition to an all-IP network is important in its own right, but it also is important because it demonstrates that the Commission (FCC) will adapt its regulatory approach to the networks and markets of the 21st century.”

The FCC Chairman then said that no one would use a network without being able to make a 911 phone call (to report emergencies and seek help from law enforcement). That implies that the all-IP network must support 911 calls in a consistent manner.

FCC Chairman Tom Wheeler (image courtesy of FCC.gov)

Wheeler told the CHM audience:

“The best way to speed technology transitions is to incent network innovation while preserving the enduring values that consumers and businesses have come to expect. Those values are all familiar: public safety, interconnection, competition, consumer protection and, of course, universal access. They are familiar, and they are fundamental.”

Continuing, he said: “At the January 30th Commission meeting, we will invite proposals for a series of experiments utilizing all-IP networks. We hope and expect that many proposed experiments, wired and wireless, will be forthcoming. Those experiments will allow the networks, their users, the FCC and the public to assess the impact and potential of all-IP networks on consumers, customers and businesses in all parts of our country, including rural America.”

All-IP Network Topic at the FCC’s January 30th Open Commission Meeting:

The all-IP network transition will be the number one agenda item at the FCC’s January 30th Open Commission Meeting.  The item, “Advancing Technology Transitions While Protecting Network Values,” is all about the transition to an all-IP network.  “The Commission will consider a Report and Order, Notice of Proposed Rulemaking, and Notice of Inquiry that invites diverse technology transitions experiments to examine how to best accelerate technology transitions by preserving and enhancing the values consumers have come to expect from communication networks.”

In a November 19, 2013 blog post Wheeler provided an overview of the all-IP network migration.  He wrote: “The way forward is to encourage technological change while preserving the attributes of network services that customers have come to expect – that set of values we have begun to call the Network Compact.”

Wheeler noted various FCC Commissioner comments in that blog post:

  • “Commissioner Pai said that the FCC should ‘Embrace the future by expediting the IP Transition.’
  • Commissioner Rosenworcel told us that, ‘As we develop a new policy framework for IP networks, we must keep in mind the four enduring values that have always informed communications law — public safety, universal access, competition, and consumer protection.’
  • Commissioner Clyburn has called upon the Commission, ‘To carefully examine and collect data on the impact of technology transitions on consumers, public safety and competition.’”

AT&T Petition and FCC Technology Transitions Task Force are encouraging trials:

On November 7, 2012, AT&T petitioned the FCC to “Launch a Proceeding Concerning the TDM-to-IP Transition,” GN Docket No. 12-353 (AT&T Wire Center Trials Petition).

That document requested the FCC to “open a new proceeding to conduct, for a number of select wire centers, trial runs for a transition from legacy to next-generation services, including the retirement of TDM facilities and offerings” and that “the Commission should also seek public comment on how best to implement specific regulatory reforms within those wire centers on a trial basis.”

AT&T requested that the FCC consider conducting trials where certain equipment and services are retired and IP-based services are offered. These geographically limited trial runs, conducted after a public comment period on how they should be carried out, would help “guide the Commission’s nationwide efforts to facilitate the IP transition.” Such an approach, AT&T notes, will “enable the Commission to consider, from the ground up and on a competitively neutral basis, what, if any, legacy regulation remains appropriate after the IP transition.”

AT&T has set a date of 2020 to retire its TDM network and has been upgrading its IP-based service capabilities in its wireline markets via Project Velocity IP (VIP).  AT&T presented a progress report on the Project VIP at the June 2013 IEEE ComSocSCV meeting.  It can be read on pages 3-4 of this article: Telco Tours & Seminars Top ComSoc-SCV Activities.

The FCC established a “Technology Transitions Policy Task Force,” which was tasked to move forward with real-world trials to obtain data that will be helpful to the Commission.  The goal of any trials would be to gather a factual record to help determine what policies are appropriate to promote investment and innovation, while protecting consumers, promoting competition, and ensuring that emerging all-Internet Protocol (IP) networks remain resilient.  The FCC task force is seeking public comment on several potential trials relating to the ongoing transitions from copper to fiber, from wireline to wireless, and from time-division multiplexing (TDM) to IP-based packet switched networks.

Technology Trials Proposed:

The FCC task force has proposed the following trials related to the all-IP network transition:

  • VoIP Interconnection
  • Public Safety – NG911
  • Wireline to Wireless
  • Geographic All-IP Trials
  • Additional trials: numbering and related databases, copper-to-fiber transition, retirement of copper?

 The US Telecom Association was very supportive of such trials as well as the previously referenced AT&T petition. In comments submitted on January 28, 2013, the trade organization wrote:

“The idea that the Commission should conduct real-world trials in order to better inform itself as to the technological and policy implications of the IP-transition is a way the Commission can continue its commitment to data-driven policy making. The Commission itself has urged carriers to ‘begin planning for the transition to IP-to-IP interconnection’ and the Commission-guided trials urged by AT&T would facilitate this effort.”

“In particular, the AT&T Petition offers an opportunity for the Commission and state regulators to conduct informative, but geographically limited, trial runs for regulatory reform in discrete wire centers. AT&T correctly notes that such an approach will enable the Commission to consider, from the ground up and on a competitively neutral basis, what, if any, legacy regulation remains appropriate after the IP transition.”

US Telecom’s comments can be read here.

Important Unanswered Issues for an all-IP network:

Transition to an “all-IP” network implies retiring the PSTN/POTS, TDM/circuit switching and all wireless networks other than 4G with VoIP over LTE. That is a huge undertaking that will be incredibly disruptive and take many years, if not decades, in our opinion.  Here are just a few points to ponder about this monumental transition:

  • Telcos and MSOs must universally deploy broadband for wireline VoIP to be ubiquitous. Currently, they make their deployment/build-out decisions strategically, based on reasonable ROI.  Not every area in the U.S. has or will have wired broadband as a result.
  • Many rural areas have little or no wireless coverage and certainly not 4G-LTE.  What happens to people who live in those areas, e.g. Arnold, CA?
  • Even if wired or wireless broadband is available in many regions, there is likely to be only one or two network providers at most.  Hence, there is little or no choice in service which is effectively a monopoly. Santa Clara, CA is in the heart of Silicon Valley, yet we now have only two choices for wired broadband – AT&T or Comcast.
  • There is currently no Universal Service Fund/Lifeline or discounted rate (for low income folks) for VoIP service.  Lifeline service is ONLY available for the PSTN/POTS.
  • If an individual or family doesn’t want or can’t afford high speed Internet and/or broadband TV service, then it will most likely be uneconomical for the Telco/MSO to ONLY provide VoIP service over broadband access. This is the case for many poor people and older Americans!
  • Battery backup is required for an all-IP network to make emergency phone calls when power is lost.  There is a substantial monthly charge for a battery backup box for AT&T’s U-verse VoIP service. An AT&T subscriber must also have battery backup power for the Wi-Fi gateway to enable their AT&T U-verse services to function during a power outage.
  • There will be a huge impact on business customers that use digital circuit switched networks if the proposed all-IP changes happen soon in the affected areas or “wire centers.” What if a company’s main or branch office site(s) are located in an all-IP wire center coverage area?  In that case, the business customer would have to give up it’s digital PBXs or hosted ISDN PRI voice trunks and move to SIP trunks–even though the company is not nearly ready for a total enterprise-wide transition to an IP voice network.
  • What happens to faxes, which are still overwhelmingly based on the analog PSTN and not IP fax? The death of fax has been predicted for over a decade, yet it is still alive and kicking!
  • The transition from the classic PSTN to an all IP infrastructure will mandate the end of Signaling System 7 and the entire infrastructure that supports it. This is a substantial undertaking, the consequences of which are not fully understood. Can SS7-based functions be replicated on a broadband IP-based network? What would be the equivalent of a “voice grade” circuit? Is a SIP connection a functional equivalent for the key functionalities of SS7 switches? What about SMS/texts?
  •  The telephone numbering system provides a way for callers served by virtually any service provider in the world to reach one another. What will replace that system has yet to be determined. It surely won’t be an IP address, which is often dynamic and allocated for temporarily reaching IP endpoints.
  •  Interconnection and inter-operability between IP and TDM networks is a work in progress for both voice and data.
  •  Quality of Service/Reliability/Resiliency is largely unknown with an all IP network, which would need to scale to replace and reach all PSTN/TDM endpoints. What would constitute an “outage,” and how should “outage” data be collected and evaluated? Here again, the battery back-up on power fail would need to be made mandatory and low cost or no cost to consumers and enterprises.

For sure, the above issues will challenge equipment vendors, regulators, business and consumers. We think the transition from PSTN/TDM/digital circuit switched to an all-IP packet network will take much, much longer than many expect.

Google Fiber – A Step Function Connectivity Improvement

A step function improvement in capability is how Milo Medin described Google’s Kansas City fiber project at the February 13th IEEE ComSoc meeting in Santa Clara. That huge improvement in customer experience is in contrast to the incremental gains of MSO [Multiple System Operator] and telco broadband networks, which have much lower access speeds.

A boom truck with a technician pulling fiber on an existing utility pole line. Image courtesy of Google.

Medin, who is VP of Access for Google, described a Gigabit/second fiber network that eliminates the bottleneck between home and the cloud, unleashing new applications and devices both in the home and, by implication, throughout a city. Google’s incremental improvements in its construction and operations, its relatively simple offering and its grass-root marketing are as important to its success as its innovative fiber and home networking technologies.

The story of Google Fiber is pretty well-known by now; Google issued an RFI a couple of years ago to which 1,100 cities responded to be the test bed for Google’s fiber to the home project. What isn’t so well-known is that the motivation for this was the middling price/bandwidth performance of the U.S. as compared to other countries. Medin, who was a key figure in the early success of cable modems through his affiliation with @Home, suggested that, instead of complaining to government, Google decided to solve the problem. The unexpected response of so many communities was a surprise to Google and, according to Medin, an indicator of a pent-up demand.

Interestingly, government turns out to be part of the reason for their success, but not in the form of subsidies or tax breaks. The techniques Google and the local city are using to streamline the permit process and work hand in hand are saving an estimated 2% of the build cost. Similarly, attachment of fiber to the poles is made somewhat easier because the local utility is municipally owned.

Thanks a Bunch CEQA, No Google Fiber for California

Rules and regulations are definitely shaping where and how the service will develop. Echoing testimony before Congress, Medin suggested that as long as CEQA [California Environmental Quality Act] is in place in its current form, Google Fiber will be virtually non-existent in California (there is an 850-home Google FTTH project on the Stanford campus). The irony that Google’s home state will not see its fiber network anytime soon was not lost on the room full of engineers at the IEEE meeting.

Medin explained that anyone can use CEQA to initiate a lawsuit to block a development. He cited the example of the use of CEQA to delay the rollout of U-verse in San Francisco for years. A linchpin of Google’s approach is achieving scale at a fast rate, and the uncertainty caused by CEQA sinks its business case. And there is a business case, as Medin pointed out that the margins on broadband are as high as 95% for incumbent providers in urban areas.

Critics who suggest an infrastructure play is far afield for a “search” company should think again:

  • With YouTube and its other properties, Google already operates one of the world’s largest Content Delivery Networks.
  • With a Fiber to the Home network, outside plant maintenance is almost zero compared to a traditional cable or telephone network.
  • With a gigabit connection and customized hardware, the home becomes an extension of Google’s data centers. Although it wasn’t said in his talk, they are sure to have TR-069 or equivalent technology to allow the monitoring of devices within the home (a rough sketch of that kind of check-in message appears below the image). Additionally, network-managed WiFi routers integrated into each set-top will deliver a better experience than the home WiFi networks cobbled together by consumers.
Depicted is a Google optical/electrical converter (ONT) that resides in the home. What's not clear is whether or not it has built-in battery back-up.
Image Courtesy of Google
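
Since TR-069 is only speculation here, the following rough sketch simply shows what a TR-069/CWMP periodic check-in (the “Inform” message) from a managed home device to its Auto-Configuration Server looks like. It is not Google’s implementation; the ACS URL, device identifiers and timestamp are made-up placeholders.

# Rough sketch of a TR-069/CWMP periodic Inform -- the check-in message a managed
# CPE posts to its Auto-Configuration Server (ACS). This is NOT Google's actual
# implementation; the ACS URL and device identifiers below are placeholders.
import urllib.request

ACS_URL = "https://acs.example.net/cwmp"   # hypothetical management server

inform = """<?xml version="1.0" encoding="UTF-8"?>
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                  xmlns:cwmp="urn:dslforum-org:cwmp-1-0">
  <soapenv:Body>
    <cwmp:Inform>
      <DeviceId>
        <Manufacturer>ExampleCo</Manufacturer>
        <OUI>001122</OUI>
        <ProductClass>FiberGateway</ProductClass>
        <SerialNumber>SN0000001</SerialNumber>
      </DeviceId>
      <Event>
        <EventStruct><EventCode>2 PERIODIC</EventCode><CommandKey/></EventStruct>
      </Event>
      <MaxEnvelopes>1</MaxEnvelopes>
      <CurrentTime>2013-02-13T12:00:00Z</CurrentTime>
      <RetryCount>0</RetryCount>
      <ParameterList/>
    </cwmp:Inform>
  </soapenv:Body>
</soapenv:Envelope>"""

request = urllib.request.Request(
    ACS_URL,
    data=inform.encode("utf-8"),
    headers={"Content-Type": "text/xml; charset=utf-8", "SOAPAction": ""},
)
# urllib.request.urlopen(request)  # would POST the Inform if the ACS existed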

Google is taking an approach that, in some ways, is reminiscent of the old Ma Bell, whereby Google designs its own equipment. From Optical Network Terminals [ONTs] to set-top boxes, Google has created devices that maximize the customer experience [a DVR that records 8 programs at once] and minimize operational cost. Medin indicated that Google has some of the world’s best optics engineers on staff.

Unlike the days of Ma Bell, Google can work with third-party manufacturers to build what they need, allowing them to introduce devices without the overhead burden of owning factories.

Keep It Simple Marketing

As with Google’s other offerings, the company is taking a brand-follows-product approach to fiber. That is, the focus is on creating an offer that provides great value and a high customer loyalty/buzz factor, so the service essentially markets and sells itself. As with its search home page, Google is keeping the offer simple. Unlike the Chinese-food menu of seemingly infinite tiers that traditional video and broadband operators offer, Google has only three:

  • 5 Mb/s with a $300 construction charge (may be amortized at $25/month for 12 months) & no recurring charges for 7 years
  • 1 Gb/s broadband with 1 Terabyte of storage for $70 per month
  • 1 Gb/s broadband with video for $120 per month, with a Nexus 7 as a remote control

These three offerings probably cover 95% of the market. Amortized over 7 years, the 5 Mb/s tier exceeds the National Broadband Plan’s minimum at a very affordable rate of less than $4 per month and serves those who can least afford broadband. The $120 per month tier includes a basic level of video that many people would like. On a dollar-per-bit basis, the $70 tier provides great value to cord-cutters, while also providing a superior broadband option for those who do not want to switch from their existing video providers.
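
Working out that sub-$4 figure, assuming the $300 construction charge is spread over the full 7 years of no recurring charges: $300 ÷ (7 × 12 months) ≈ $3.57 per month.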

Further simplifying the offering is Google’s decision not to offer telephone service as part of the bundle. Although this decision was made for regulatory reasons, it reduces the operational complexity of the network and minimizes the staff required to run it. With one less complex feature to offer, network implementation is faster. Google probably doesn’t lose much of its Total Addressable Market, given the number of people who are either wireless-only or can easily pick a VoIP service (including Google Voice, which works great with an Obihai VoIP adapter).

A picture of a Google truck at a customer install. Note the lawn sign promoting the Google Fiber project.
Image Courtesy of Google

As so many independent, rural operators have done with their Fiber to the Home deployments, Google is taking a grass-roots approach to marketing. Google uses a crowd-sourcing technique to determine where to build. Instead of taking a top-down approach that focuses on demographics, Google split the Kansas City market into neighborhoods. When a critical mass of people commit to service in a given neighborhood, Google builds out that area, creating what it calls a “Fiberhood.”

Where Google builds is thus dependent upon the citizens of a given neighborhood. As with the way it has marketed its other Internet businesses, Google is betting on and seeding efforts to create a viral buzz about its network. One of the more interesting developments is its retail store. Although not mentioned in a recent Wall Street Journal article about Google’s rumored jump into retail, this point of presence offers a physical location to educate potential customers and the local influencers who will help sell their neighbors on the service.

And this approach seems to be working: Medin reported that in some neighborhoods 50% of the residents are committing to Google Fiber prior to build-out.

Just the Beginning

The Google Fiber Space retail store hints at some of the future developments that a gigabit network enables.
Image Courtesy of Google

A gigabit to the home, with its low latency and high speed, brings the compute power of the cloud to the home, particularly when much of the content is cached locally within Kansas City. In a sense, this extends Google’s cloud platform to the home and business, such that performance at the end point is virtually the same as it would be in the data center. Medin hinted that 1 Gb/s is just a start. It is not too difficult to imagine the types of things that could be enabled with this sort of bandwidth, such as:

  • City-wide WiFi or some other wireless solution (Google has received FCC authorization to experiment with various wireless approaches for access). City-wide wireless could offer a low-cost mobile/nomadic solution for its customers. It could also be important for autonomous transit options.
  • Distributed data centers: with 1 Gb/s connections, a peer-to-peer compute network (think a connection of those DVRs) becomes a possibility. Why not use that computing power as well and create a virtual data center spread over hundreds of thousands of residences?
  • As Google has done with its Android and Chrome operating systems, the fiber network has the potential to enable applications from third parties. It is possible that some of these apps might even come from existing telecom providers.

The Google Fiber project in Kansas City is on its way to meeting its goal of showcasing how low-latency, gigabit-per-second bandwidth can transform a city one neighborhood at a time. The fiber is really serving as a last-mile nervous system that connects the seemingly disparate pieces to an ever-expanding Google ecosystem, which is where the change will really take place. Unfortunately for California residents, and particularly ironic for Silicon Valley residents, new Google Fiberhoods won’t be making their way to the Golden State anytime soon.

[Author’s Note: Thank you, IEEE, for facilitating the excellent program that featured Medin as one of the speakers, and thank you, Alan Weissberger, for your editing assistance.]

AT&T to Expand U-Verse & IP-DSLAM; Bring Fiber to Commercial Buildings & Cover 99% of US with LTE!

In the most significant announcement since SBC acquired the old AT&T and became “the new” AT&T, the telco giant announced it will spend $14B over the next three years to expand its wireline and wireless networks under its newly coined “Project Velocity” initiative.  The company wants to move to an all IP network platform, which means they’ll be phasing out TDM transmission and the PSTN.

Surprising most analysts, AT&T said $6B of that $14B will be spent on wireline upgrades.  In particular:

1.  Residential Broadband via U-Verse, IP-DSLAM and (in some rural areas) LTE:

Traditional U-Verse (TV, high-speed Internet, VoIP) as well as U-Verse IP-DSLAM (high-speed Internet and VoIP, but NO TV service) will be available in many more areas, with Internet access speeds of 75 Mb/sec for most customers and many achieving speeds of up to 100 Mb/sec (downstream).

Currently, 32% of AT&T’s customers are covered by (triple-play) U-Verse; that footprint will grow by one-third, adding 8.5M customer locations to reach 43% coverage by the end of 2015. U-Verse revenues are running at a $9B annual rate and increasing at a 38.6% annual rate. AT&T is making “customer retention improvements in all areas.” (Presumably to avoid losing U-Verse customers to triple-play MSO services, e.g. Comcast/Xfinity, which runs commercials enticing U-Verse customers to come back to Comcast for better high-speed Internet and TV service.)

AT&T’s wired IP broadband network will expand to 75 percent of residential customer locations in AT&T’s 22-state wireline service area by year-end 2015. Not all of those potential customers will be able to get U-Verse TV service. Rather, they will be connected to IP DSLAMs to achieve higher-speed Internet access. (This means AT&T will be jettisoning its ATM-over-ADSL network in favor of IP/Ethernet transport from customer premises to its initial point of presence, where the DSLAM resides. Customers currently using ATM over ADSL will have to be retrofitted with new CPE to access U-Verse IP DSLAM or traditional U-Verse.)

AT&T's wireline coverage
Image Courtesy of AT&T

The higher Internet access speeds (over last-mile copper) for U-Verse and IP DSLAM will be achieved by VDSL pair bonding, “small form electronics,” and VDSL vectoring.
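
As a rough illustration of how those speeds are reached (the per-pair figures are our assumptions, not AT&T’s): pair bonding roughly adds the rates of the two copper pairs, and vectoring cancels crosstalk between pairs in the same binder, so if each cleaned-up pair carries on the order of 35-50 Mb/sec of VDSL2 throughput at typical loop lengths, two bonded pairs yield roughly 70-100 Mb/sec, in line with the 75-100 Mb/sec targets above.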

Status Report: Currently, about 9% of AT&T landline customers cannot get broadband, while 32% have the U-Verse triple play available to them, 32% have U-Verse IP DSLAM available and 27% are served via legacy broadband (i.e., non-IP DSL).

By the end of 2015, AT&T said, 99% of customers in its 22-state service area will have broadband available to them via either a wireline or an LTE option. The breakdown is as follows: 43% of customers will have access to the U-verse triple play, 32% will have access to U-verse IP DSLAM, and the remaining 25% will need to rely on LTE, which will be available to 99% of AT&T’s customer base.

Other key points related to residential broadband are as follows:

  • Much higher-speed wireline Internet, via either U-Verse triple play or U-Verse IP DSLAM (double play), will be available to 57M AT&T customer locations by 2015. The IP DSLAM double-play offering of broadband Internet and VoIP will be available to 24 million customer locations by year-end 2013. Those customers will be offered a triple-play bundle based on IP DSLAM plus satellite video service from Dish Network.
  • Customers in rural or remote locations (presumably the 25% who won’t get wireline broadband access) will be able to get LTE wireless access, as AT&T plans to extend its 4G LTE build-out to cover 300M POPs by the end of 2014.
  • High-speed IP connectivity will be available to 99% of wireline service-area customers (via U-Verse, IP-DSLAM or LTE) by 2015.

Summing up, AT&T firmly believes that:

  • Wireline IP broadband is structurally attractive in dense population areas
  • IP broadband is the most important product in the triple or quad-play bundle
  • AT&T IP broadband will meet customers’ growing speed requirements
  • Significant synergies exist between wireless and wireline assets

2.  Fiber to the Building deployment:

AT&T will light fiber to reach 1 million additional business customer locations, covering 50 percent of multi-tenant office buildings in AT&T’s wireline service area by year-end 2015 (up from about 15% fiber-connected nationwide today).

3.  Strategic Business services:

IP VPN, Carrier Ethernet, (Web server) hosting and various managed business services generated $6.4B in revenues last year and are growing at 14.5% annually. Wireline data and managed IT services for enterprise customers are growing at a rate in excess of 6%.

Cloud computing and security are seen as the next big growth opportunities as AT&T transitions to managed services for its enterprise customers. In particular, AT&T plans to partner with cloud service providers as well as provide cloud services over its own managed IP network, leveraging that network’s performance, reliability and security. During the Analyst Day webcast, AT&T said, “Virtualization and mobilization are driving the need for a ubiquitous, dense wireline footprint solutions that bundle cloud with connectivity (AKA Cloud Networking), symmetrical bandwidth, and security through active network management.”

CenturyLink/Savvis and Verizon/Terremark are recognized cloud leaders, each having very solid cloud computing offerings with managed IP VPNs for delivery of cloud services. They will now face much more competition from AT&T in the cloud space.


Additional information on U-Verse:

As indicated in the graph below, AT&T’s broadband market share is growing in areas where U-Verse is available to residential customers.

Image Courtesy of AT&T

U-Verse has delivered 5 years of top line growth for AT&T:

  • $9.5B revenues, which are growing 38% Year over Year
  • 7.1M IP broadband subscribers, with 2.5M added in last 12 months
  • 4.3M IPTV subscribers, 760K gained in last 12 months
  • 18% U-verse video penetration; 23% U-verse broadband penetration
  • ~$170 ARPU for U-verse triple-play service bundle

Synergies between AT&T’s wireline and wireless networks:

At a Wells Fargo investment conference on November 8th, AT&T’s VP & CFO John Stephens said that AT&T evaluated commercial buildings with six tenants or more to determine whether they should be fiber connected. Considerations included distance from AT&T’s central office (CO), cost efficiency and build-out cost.

A huge side benefit for AT&T is that once fiber to the building is installed and AT&T owns the right of way, the company will install Distributed Antenna Systems (DAS) along the fiber route to provide increased 3G/LTE wireless coverage. The DASs would use fiber backhaul to AT&T’s CO. Mr. Stephens hinted that DASs (deployed along the fiber-to-the-building route) might also be used for broadband wireless offload, but did not disclose any details on how that might work or be configured.

[Infonetics analyst Stéphane Téral recently said in an email, “The majority of operators are still using distributed antennas (DAS) in their mobile networks for coverage, and despite all the talk about using small cells to boost capacity in large venues, operators we interviewed believe DAS will remain a fundamental tool for malls, airports, stadiums and the like.”]

Mr. Stephens was both enthusiastic and confident during his presentation. He said, “AT&T is investing in tried and true things we know. We are moving away from PSTN and toward an all IP network platform for delivery of all telecom services including voice.”

Closing Comment:

During its November 7th Analyst Day webcast, AT&T CEO Randall Stephenson echoed Mr. Stephens’ confidence: “These are things we’ve done before – logical extensions of proven technologies and already successful businesses. We are very confident in our ability to execute this plan.”

Reference:

http://www.att.com/gen/press-room?pid=23506&cdvn=news&newsarticleid=35661

Summary of Telecom Council TC3, Part 1- Service Provider Innovation Forum

Introduction:

Telecom Council Carrier Connections (TC3), the Telecom Council’s annual summit, was held Sept. 12-13, 2012 in Sunnyvale, CA. The event provides an opportunity for startups and application developers to interact with telecom carriers (telcos) and network operators. Telco representatives who manage innovation, from developer programs and lab facilities to venture investing, discussed many issues that are relevant to their vendors and partner companies. These included:

  • What innovations are network operators looking for?
  • How does a young company work with a large operator?
  • What kind of partnerships do carriers prefer?
  • Who are the right people inside the carriers to properly receive, handle, and implement new ideas?
  • What developer and partner programs are available?

A complete description of TC3 is available here from the Telecom Council.

Telco Innovation in SF Bay Area and Startups:

Over the last few years, more than 25 Telco Innovation Labs have opened in the SF Bay Area, including Sprint’s in Burlingame, AT&T’s in Palo Alto, Verizon’s in San Francisco and Deutsche Telekom’s in Palo Alto. These Telco Innovation Labs serve as incubators and offer testing facilities to a wave of startups, particularly in the wireless space. Global telcos have also established Venture Capital (VC) divisions throughout the SF Bay Area. This makes Silicon Valley a very appropriate place to hold the TC3 summit. Throughout the two-day summit, speakers from telcos and mobile operators described what they’re doing for developers and how they’ve been handling partnerships with startups.

Telecom Council Service Provider Innovation Forum (SPIF) Meeting:

TC3 conference chair Derek Kerton said that the Silicon Valley culture of co-operation has been working for carriers: they are able to share leads and help each other out without worrying about competition. With that introduction, the first Telecom Council Service Provider Innovation Forum meeting open to the public began. Eleven later-stage startups with network-ready products and services that “push the envelope of telecom innovation” gave rapid-fire pitches. After each presentation, SPIF session moderator Liz Kerton invited carriers sitting in the front rows to ask questions.

We highlight three of the most interesting rapid-fire vendor pitches below. Click here for the complete TC3-2012 agenda.

1.  Actellis has offered Ethernet-over-Copper products but is now shifting into the residential broadband space. Millions of Americans don’t have broadband access, primarily due to a lack of infrastructure. The FCC is trying to address this problem with the Connect America Fund. “The digital divide is a challenge, but an economic opportunity for carriers,” said Chris Heinemann, Director of Marketing at Actellis.

The Actellis Broadband Accelerator (BBA) delivers high-speed broadband services to currently unserved and underserved customers who are out of reach because of their geographic location. The patented, shoe-box-sized unit is placed between the telco’s DSLAM and the ADSL subscriber. It provides “ubiquitous coverage over the existing copper infrastructure and takes only 15 minutes to install. The BBA is in field trials worldwide, with several deployments in the U.S. and one in South America,” according to Mr. Heinemann. He encouraged the audience to watch YouTube videos describing the product and how to install it (wall or pole mounted). Please refer to:

Overview (YouTube video)

How to Install (YouTube video)

The BBA received the 2012 NGN Leadership Award for outstanding innovation.

Author’s Note: Mr. Heinemann did not disclose how the Broadband Accelerator actually works. Yet he implied it could be used to deliver 15 to 30 Mb/sec of total bandwidth per subscriber (the sweet spot for triple-play services).

2.  Joyent provides “Cloud Infrastructure for Real-time Web and Mobile Applications.” The company, which counts Intel and Telefonica as investors, “builds a data center as a solid state device,” according to Jason Hoffman, Joyent’s founder. The company’s strategy is focused on local service delivery from a global alliance of tier 1 mobile carriers that operate their own mobile clouds and/or Infrastructure as a Service (IaaS) on Joyent’s data center fabric.

Joyent’s data center technology addresses challenges in real-time, latency-sensitive mobile apps. “It’s designed as the back end of the storage array that runs Virtual Machines (VMs). The product can do throttling, scheduling, bursting and I/O acceleration in a unique way,” according to Mr. Hoffman. “It can detect when applications are running slow via real-time diagnostics and trace capabilities,” he said.

3.  Shared Spectrum Company is not technically a startup, as it was founded in 2000 and funded by DARPA. The company develops embedded wireless software for accessing shared spectrum resources and mitigating the effects of RF interference by avoiding affected bands. Its Dynamic Spectrum Access technology senses which frequencies are in use, as well as interference in nominally unused bands, and switches wireless traffic to frequency bands that are both unused and clear of interference. Shared Spectrum’s software has been embedded in products from military radio manufacturers. Recently it has been used by InterDigital, a femtocell vendor. The company is now hoping to attract a broader range of OEMs (as described below). A toy sketch of the basic sense-and-switch idea appears next.
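
The sketch below illustrates only the general sense-and-switch concept; it is not Shared Spectrum’s algorithm, and the candidate bands and occupancy threshold are illustrative assumptions.

# Toy dynamic spectrum access loop -- NOT Shared Spectrum Company's algorithm.
# The candidate bands and the occupancy threshold are illustrative assumptions.
import random

CANDIDATE_BANDS_MHZ = [512, 518, 524, 530, 536]   # hypothetical TV white-space channels
OCCUPANCY_THRESHOLD_DBM = -85                     # above this, treat the band as occupied

def measure_energy_dbm(band_mhz):
    """Stand-in for a real spectrum-sensing measurement on the given band."""
    return random.uniform(-110, -60)

def pick_clear_band():
    # Sense every candidate band, keep those below the occupancy threshold,
    # then retune to the quietest one; return None if everything looks busy.
    readings = {band: measure_energy_dbm(band) for band in CANDIDATE_BANDS_MHZ}
    clear = {band: level for band, level in readings.items()
             if level < OCCUPANCY_THRESHOLD_DBM}
    return min(clear, key=clear.get) if clear else None

band = pick_clear_band()
print(f"Retune to {band} MHz" if band else "No clear band; back off and re-sense")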

Based on measurements his company has performed in major markets around the country, CEO Tom Stroup claims there is no spectrum shortage (in direct conflict with AT&T’s CEO Randall Stephenson who says AT&T needs a whole lot more spectrum to cope with exponential growth in mobile data traffic).

Instead, Mr. Stroup maintains that most allocated spectrum is not used. He said that less than 20% of available and allocated spectrum is in use at any given time. That’s quite a bold statement!

The company sees a growth market in mobile cloud computing, which requires additional spectrum with QoS. An example is TV white spaces (unused frequencies allocated to TV broadcasters), where interference from wireless microphones must be detected so that those bands are not used for wireless broadband services. The company’s “Spectrum Sensing Toolbox” is targeted at equipment used in femtocells, IEEE 802.22 Wireless Regional Area Networks, digital broadcasters in Europe, Machine-to-Machine (M2M) devices, and Department of Defense and civilian government radio systems. Mr. Stroup said that “Shared Spectrum’s Dynamic Spectrum Access technology was applicable across the world.”

Author’s Note: Various proposals, including IEEE 802.11af, IEEE 802.22 and those from the White Spaces Coalition, have advocated using white spaces left by the termination of analog TV to provide wireless broadband Internet access. A device intended to use these available channels is referred to as a “white-spaces device.” The FCC will meet September 28th to discuss rules for an auction where UHF broadcasters will sell spectrum to wireless carriers that have complained about a lack of available spectrum as U.S. consumers increasingly use and want more mobile data.

Closing Comment:

“Our monthly Service Provider Innovation Forum meeting has a 12-year history of helping entrepreneurs meet telcos and vice versa,” said Liz Kerton, President of the Telecom Council of Silicon Valley. “This was the first time the public has had access to the inner circle of the Council, and it worked well enough to do it again next year.”

Indeed, this author found the meeting a very effective way for telcos and start-ups to meet one another.

—————————————————————————————————————————————————————–

This concludes part 1 of the TC3 Summary.

Part 2 will cover the panel session on Rich Communications Suite (RCS) and carrier innovation agendas, strategies and case studies.  Part 3 will be on WiFi Offload 2.0.

Reference:

SPIFFY award winners at TC3 in various categories-

http://viodi.com/2012/09/25/entrepreneur-forum-carrier-perspectives-summary-of-telecom-council-tc3-part-2/

 

Does Broadband Lead to a Broad Waist?

In late August, the Milken Institute issued a provocative study, Waistlines of the World – The Effect of Information and Communications Technology on Obesity. The report looked at 27 OECD countries over the period 1988-2009 and the impact of a knowledge-based society on obesity rates in those countries. The growth of these rates is a serious concern, as overweight and obesity together are the fifth leading cause of death worldwide, according to the report’s citation of World Health Organization statistics.

The report suggests that,

“For every 10 percentage point increase in the share of ICT spending, obesity rates will significantly rise by 1 percentage point directly and 0.4 percentage point indirectly based on the impact of additional consumption of leisure ‘screen’ time.”

The U.S. has the highest rate of obesity, leaping from 23.3% to 33.8% between 1991 and 2008. In absolute numbers and percentage growth, China’s obesity rate is a huge concern: it more than doubled between 2002 and 2008, from 2.5 to 5.7 percent, and the number of overweight people doubled from 1991 to 2006.

The report is loaded with statistics like these, but one assumption taken as a given in the report is that “Urbanization, in general, leads to a more sedentary lifestyle and hence weight gain.” The implication is that urban areas will see higher obesity rates than rural areas. While this may be true in developing countries, based on the study it is not clear whether this is the case in already-urbanized countries such as the United States.

Information and communication technology (ICT), as defined in the report, is somewhat expansive and includes,

“Information technology (IT), unified communications, telecommunications (telephone lines and wireless signals), broadcast media, all types of audio and video processing and transmission, and network-based control and monitoring functions.”

The report suggests the world’s transition to a knowledge-based society has led to changes in work habits (less manual labor, more dual-income families) and lifestyles (increasing urbanization, greater caloric intake, more screen time), which lead to obesity. Backing up these conclusions are complex econometric models that are well beyond the intellectual firepower of this reporter, but common sense says more screen time leads to a bigger waistline.

More common sense is embodied in their citation of the “last hour” rule, which,

“basically states that when the enjoyment associated with technological advances increases in sedentary leisure, people will devote more time to sedentary entertainment at the margin.”

In other words, most people will kick-off their shoes and sit in front of a screen or screens at the end of the day instead of exercising.

Prescription for Change 

Spring Grove Fitness Center

Their prescription for change involves governmental and employer assistance to help make exercise part of everyday life, as it was before screen time. Again, more common sense, but they recommend policies and programs that reduce reliance on vehicle transportation and encourage walking, bike riding and, of course, exercise programs.

Some 27 existing government and corporate programs from around the world are cited in their list. A concept of interest to telecommunications providers is their suggestion of tighter coupling between health-care providers and patients via things such as tracking biometric data. Another example of a program that ties patients closer to health-care professionals is one started by an Ohio physician, called “Walk with a Doc.”

Employee Fitness Program

Two programs that could be added to their list come from rural broadband operators. Spring Grove Communications, as seen in this 2010 interview, built a gym/library/community center when it rebuilt its office, benefiting its employees as well as the community at large. Arrowhead Electric has a simple program that rewards employees for starting and sticking to an exercise program. Both represent ways to help people proactively prevent broadband usage (and other screen time) from leading to a broad waistline.

OTT movies equal VOD with more gardens to pick from

Sometimes I think about keeping thoughts to myself, as they often conflict with what people want to hear or contradict what they believe; then I feel bad, then it happens, and then the cycle repeats.

So here we are at the refresh of another cycle. I’m about to feel bad. Over-the-top video progressed faster than I thought and gave viewers more options than I expected. I now think OTT has done better than VOD because it offers more options, probably did a better job of marketing, and will likely dominate in the future because of those options. To me, OTT movies are simply VOD out of someone else’s garden, but with a vehicle driving me to many more gardens to pick from.

So what got me thinking about this cycle? Well, I finally got a Roku box, which I thought would stop me from plugging my PC into the TV, but it didn’t. Next I read that people think VOD is losing out to OTT because it has inadequate advertising support and awkward program guides. This was in a report summary sent out recently by TDG and repeated by many others: MarketWatch, Broadband TV News, Marketwire, WCBSTV, etc.

Yes, I’ve been told that I’m not reading between the lines of the summary and that it’s really about cable operators losing a market they should have controlled. So perhaps, yes, VOD should have done better, but how can it compete against all the options?

  • Roku gives me the option to pay a monthly fee for Netflix, to rent by the movie from Amazon, or to watch for free on the Crackle channel with advertisements. Those are only three of many options.
  • PlayStation, Wii and Xbox give me many of the same options as Roku and at the same time provide entertaining games; those consoles are owned by perhaps three or four times as many people as have VOD boxes. I have all three.
  • I can still plug my PC into the TV and rent YouTube movies or watch for free with a commercial. I often go to YouTube for free movies because I prefer the single opening commercial to Crackle’s one every 15 or 20 minutes.
  • And there are more media players out there with similar options that I won’t even get started on.

So why would VOD dominate the movie watching world? I just don’t see it… or should I say watch it.

Now please don’t be offended if you’re a VOD provider or supporter. This is just the way I look at things, and I’m an older baby boomer. I’m not the one to watch out for; it’s the X, Y and Z generations that will decide how this plays out. The younger generations are way more exposed to this stuff than this atypical baby boomer, but a lot of operators recognize this as they focus on broadband, so the future is looking brighter for options.