1. Start by knowing just what HD voice is. Despite its reputation for realistic-sounding audio, HD voice doesn't deliver from speaker to listener the full spectrum of sounds audible in a real-life conversation. It does, however, transmit more than double the range of audio frequencies that traditional phone calls do. PSTN connections, sampling voice audio 8,000 times per second, transmit frequencies from roughly 300 Hz to 3,400 Hz, a range of just over 3 kHz. HD voice technology, sampling the audio 16,000 times per second, can transmit frequencies from around 50 Hz to 7,000 Hz, a range of almost 7 kHz. In concrete terms, that means HD calls can carry sounds a full two octaves lower, as well as sounds significantly higher, than those in conventional calls.
The difference is important because humans can detect frequencies ranging from 20 Hz to 20,000 Hz (20 kHz), and human speech typically spans a 10-kHz to 12-kHz band within that range. Thus a PSTN call delivers sounds representing only a fraction of the range one would hear in a live conversation — the rest is deliberately filtered out. And HD voice, though still not transmitting all audible frequencies, delivers enough of the ones relevant to human speech that many listeners say it seems like the speaker is in the same room.
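The numbers above can be checked with a quick calculation: a codec's sampling rate caps the frequencies it can represent (the Nyquist limit), and the octave gap between the two passbands' lower edges follows from a base-2 logarithm. A minimal sketch:

```python
import math

# Passbands cited above (Hz): PSTN narrowband vs. HD (wideband) voice.
PSTN_LOW, PSTN_HIGH = 300, 3400
HD_LOW, HD_HIGH = 50, 7000

# Nyquist: a codec sampling at rate fs can represent frequencies up to fs / 2.
print(8000 / 2)   # PSTN sampling ceiling: 4000 Hz
print(16000 / 2)  # HD sampling ceiling: 8000 Hz

# How many octaves lower HD's floor reaches compared with the PSTN's:
octaves_lower = math.log2(PSTN_LOW / HD_LOW)
print(round(octaves_lower, 2))  # ~2.58 octaves
```

The result slightly exceeds the "full two octaves" in the text, since 50 Hz is a bit more than two halvings below 300 Hz.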
2. Understand the real-world limitations of conventional PSTN voice. Understanding how much easier HD makes communicating by phone requires first understanding how hard traditional voice technology makes it. That's difficult to grasp because we're so accustomed to the limitations of the PSTN that we hardly notice them. But they're very real. To start with, cutting off so many high and low sounds makes distinguishing between s and f, or other similar sounds, a major challenge. Likewise, many individuals have voices that are significantly higher- or lower-pitched than others'. Thus in conventional conference calls it's often difficult to know who is speaking. It's similarly hard to understand people with different accents — an increasingly important consideration when even small businesses are often global operations. It's also easy to miss subtle verbal cues that come through in live conversations.
Such limitations impose significant penalties on call participants. For one thing, they force listeners to make extra efforts to understand what is being said and who is saying it, by interpolating or simply guessing. That can be mentally exhausting, particularly during long conference calls. And if participants have trouble discerning, for example, whether a key number was $15 million or $50 million, or which executive was in favor of a key proposal, it can even be disturbing.
3. Understand what HD voice will and won't do for you. The main benefits of HD voice lie simply in eliminating such penalties. Long calls, especially conference calls with multiple speakers with different accents, become less tiring and thus more productive. Participants are able to spend less time calling back afterward to clarify what various speakers said, or worrying that they got it wrong. Other benefits are less tangible. For example, because it lets so many more verbal nuances come through, HD voice may improve a company's rapport and image with whatever customers are involved in such calls. On the other hand, it definitely won't provide the kind of clear-cut ROI, such as measurable savings on long-distance charges, that moving to VoIP transport, for example, does.
4. Get HD-capable phones, not HD-compatible ones. Jeff Rodman, Polycom co-founder and voice division CTO, argues that IP phones with HD codecs such as the ITU-standardized G.722 may not necessarily provide true HD voice quality. That's because it takes more than a codec to deliver the full range of sound that the codecs can theoretically transmit. For example, HD-capable microphones and speakers are also necessary, as well as careful acoustic design. Polycom and snom have been particularly aggressive about building IP desk phones that take full advantage of HD voice, and AudioCodes is moving in the same direction. Softphones, the client software programs that allow users to make VoIP calls through headset-equipped PCs, can also provide HD capability.
5. Make sure your IP PBX is HD-compatible. For HD phones to work, whatever IP PBX they are connected to has to recognize the codecs they incorporate. Most IP PBXes these days are G.722-aware, but many IP phones contain other codecs in addition to the basic ITU standard. Making an IP PBX compatible with a given codec is a software upgrade, though compatibility testing is always necessary. Either way, make sure the different types of equipment you're using or planning to buy work together.
6. Make sure your network equipment can handle it. Polycom's Rodman also noted that in certain cases wideband codecs will increase the bandwidth requirements of phone calls. While the G.722 codec requires the same bandwidth per call as the narrowband G.711, some existing VoIP setups using high-compression codecs may be designed to use even less bandwidth. In such cases, moving to HD voice may not only increase bandwidth requirements, but also put a strain on queues, buffers and other elements of your LAN and WAN equipment.
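As a rough back-of-envelope illustration (using nominal codec bit rates and typical 20 ms RTP packetization — assumptions for the sketch, not figures from the article), per-call IP bandwidth can be estimated like this:

```python
# Nominal payload bit rates (kbps) for three common voice codecs.
CODEC_KBPS = {"G.711": 64, "G.722": 64, "G.729": 8}

def call_bandwidth_kbps(codec, ptime_ms=20, header_bytes=40):
    """Approximate one-way IP bandwidth per call, assuming 20 ms packets
    and 40 bytes of IPv4/UDP/RTP headers per packet."""
    packets_per_sec = 1000 / ptime_ms
    payload_bps = CODEC_KBPS[codec] * 1000
    header_bps = packets_per_sec * header_bytes * 8
    return (payload_bps + header_bps) / 1000

for codec in CODEC_KBPS:
    print(codec, call_bandwidth_kbps(codec), "kbps")  # G.711/G.722: 80.0, G.729: 24.0
```

This matches Rodman's point: G.722 costs no more bandwidth than G.711, but a network engineered around a high-compression codec such as G.729 would see per-call bandwidth roughly triple when moving to wideband.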
7. Analyze your telecom connections and service. HD phones aren't much help if the connection between them can't support HD calls. But that's exactly what happens whenever a call travels over the PSTN for even a portion of its journey — the entire call becomes narrowband. There's not much you can do about it when you're making or receiving calls to or from outside companies or individuals. You have much more control, though, for calls between branches of your own company. Choosing a provider that offers end-to-end SIP (Session Initiation Protocol) connections, for example, will also let you have end-to-end HD voice. And it will do so even if you have different types of HD phones or softphones at different locations. When it first sets up a call, SIP checks the endpoints, in this case the IP phones, to find out what codecs they have in common. It then transmits the call using the codec that will provide the highest-quality connection.
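The codec-matching step SIP performs at call setup can be sketched as follows; the quality ranking and function names are illustrative, not part of the SIP specification:

```python
# Assumed quality ranking, best first: wideband G.722, then narrowband
# G.711, then high-compression G.729.
QUALITY_RANK = ["G.722", "G.711", "G.729"]

def negotiate_codec(caller_codecs, callee_codecs):
    """Pick the highest-quality codec both endpoints support,
    mimicking SIP's offer/answer negotiation at call setup."""
    common = [c for c in QUALITY_RANK if c in caller_codecs and c in callee_codecs]
    if not common:
        raise ValueError("no common codec; call cannot be set up")
    return common[0]

print(negotiate_codec({"G.722", "G.711"}, {"G.711", "G.722", "G.729"}))  # G.722: HD call
print(negotiate_codec({"G.711"}, {"G.722", "G.711"}))                    # G.711: falls back to narrowband
```

The second call shows why mixed phone fleets still work: when one endpoint lacks an HD codec, the call simply proceeds narrowband.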
8. Consider HD-capable hosted VoIP. Like IP PBX vendors, hosted IP PBX providers are seeing the light when it comes to HD voice. Companies such as Alteva, Apptix, FreedomIQ, IP5280 and LightEdge Solutions, for example, offer services that can use HD-capable phones from Polycom. And snom has introduced a number of phones incorporating the HD technology it calls klarVOICE wideband audio, which it is targeting at so-called standards-based (i.e. SIP-based) hosted VoIP services, among other applications.
9. Use an HD voice-capable conferencing service. As noted, conference calls are one place where you want to make sure everyone can hear everyone else as clearly as possible. Most commercial conferencing services, however, focus almost exclusively on PSTN connectivity. A Citrix Online service called HiDef Conferencing, formerly Vapps, similarly allows callers to dial in from the PSTN. But because it is integrated with Skype, any or all participants can alternatively call in on the Internet phone service. Those who do will hear at least each other in Skype's full HD glory. Similarly, a number of Web conferencing services offer a VoIP option with HD capability for the audio component of their online meetings. And VoIP conferencing service ZipDX, which supports HD (which it calls wideband) voice conferencing with Polycom phones, recently announced the integration of its service with snom klarVOICE phones as well.
10. Be aware — or beware — of background noise. Remember that HD voice connections transmit a lot of sounds that PSTN calls don't. Listeners will hear every rustle of a paper, every click of a keyboard, every conversation in the hallway and every sound from the next cubicle. One soon learns that "potato chips have different ranges of crunches," noted Cailin Pitcher, senior marketing manager for Citrix Online.
11. Wait for the world to catch up to you. Even with HD equipment and SIP phone service, you may not have a lot of HD voice calls to start. The ones you do have will mainly be internal calls between branches. For outside calls, you'll be stuck with mostly non-HD connections for some time. That's because end-to-end HD calls across different service providers' networks require that the providers have interconnected IP voice networks, with no transit over the PSTN. And there won't be enough such interconnections to make external HD calls commonplace for years.
That doesn't mean you shouldn't go with HD right now if it otherwise makes sense. The phones you buy today will typically have a useful life of six to seven years, or longer. And they should (make sure of that, too) be firmware upgradeable for compatibility with new codecs as those come out. By that time, there will be a lot more interconnected provider VoIP networks, so even calls to and from outside customers or suppliers could be HD as well. In the meantime, your own internal communications will improve dramatically. That alone could make the investment worthwhile.
Thursday, June 4, 2009
Posted by muhammad abbas at 2:29 AM 0 comments
How to Benefit From HD Voice
It takes more than HD phones to make calls sound like they're coming from the same room.
HD (high-definition) voice, also known as wideband voice, is gaining a lot of attention these days. And for good reason: The audio technology, which lets IP phones send a far broader range of sounds over VoIP connections than traditional phones can over PSTN (public switched telephone network) circuits, vastly increases the clarity of voice calls. That can benefit businesses and individuals in tangible ways. But making good use of the technology involves more than simply buying new phones labeled HD. Here are some steps to take to gain the maximum benefit from HD voice.
Japan and India put wireless plans in motion
Countries ready themselves for LTE and 3G spectrum auctions
Japan and India may be far removed in terms of geography, but they do have one thing in common: a drive to expand the availability of mobile broadband.
First up is the ever-ambitious Japan. The country appears to be ready to press ahead with 4G mobile services or Long Term Evolution (LTE) with the government reportedly ready to offer licenses in the second half of 2009 amidst talk that local telcos will invest up to US$10 billion.
With its track record as an early adopter, Japan joins telcos in Europe and the US whose telecom sectors have announced LTE implementations for 2010 and 2012, according to budde.com, an Australia-based telecom analyst firm. “The Japanese operators are expected to face lower costs for the new networks versus 3G, when they paid a premium on equipment for being early adopters,” the analyst’s report added.
It is understood Japan’s four leading wireless operators, comprising NTT DoCoMo, KDDI, Softbank Mobile and eMobile, have submitted applications for the 4G licenses within the May deadline set by the Ministry of Internal Affairs and Communication (MIC). NTT DoCoMo has allocated a budget of JPY300-400 billion (US$3-4 billion) over five years and is targeting a launch as early as 2010, while Softbank Mobile is aiming for a 2011-2012 start-up with a budget believed to be around JPY100 billion (US$1 billion), the report continued. KDDI, the number two mobile carrier, which initially planned to migrate from CDMA EV-DO, is now planning an LTE overlay and will operate the networks in parallel, while eMobile aims to launch its 4G service in 2011.
The telecom analyst added that the telcos will use frequencies in the 2,010 MHz to 2,025 MHz range for LTE technology, whose speeds are roughly comparable with those of fiber-optic networks. A number of domestic carriers intend to reuse existing 3G infrastructure, on which they have spent JPY5 trillion (US$50 billion), to keep a lid on rollout costs.
A tradition of firsts
Meanwhile, the country’s telecom sector has reached a landmark, with 3G subscribers exceeding 100 million in April 2009, announced Japan’s Telecommunications Carriers Association. DoCoMo launched the country’s first 3G service in October 2001, followed by KDDI’s CDMA2000 1X in April 2002, Softbank Mobile’s WCDMA in December 2002 and eMobile’s HSDPA in March 2007.
Continuing its tradition of pioneering new services, DoCoMo announced that it expects to introduce a service letting its mobile subscribers transfer cash without keying in banking details. After applying online, the subscriber enters the mobile phone number of the recipient, who must also be a DoCoMo subscriber, and the amount is charged to the sender’s phone bill. The Japanese telco is targeting a summer launch, with transfers limited to JPY30,000 (US$320) a month.
India: 3G is top priority
India’s telecoms minister, A. Raja, told the Press Trust of India after his reconfirmation following the recent election that the 3G auction is at the top of his priority list. The minister indicated that the Department of Telecommunications (DoT) will put up proposals for a decision by the cabinet. A DoT official was quoted in a local newspaper as saying that a ‘3G auction will definitely be held this year and sooner rather than later’.
After creating much buzz with an online auction scheduled for 16 January 2009, then delayed to end-January, the auction has now been postponed indefinitely. There have been warnings that the country and its businesses are losing out because of the delays in awarding 3G licenses (see India’s 3G auction looks scuttled as government eyes a doubling of reserve price). Two state-owned telcos, Bharat Sanchar Nigam Ltd (BSNL) and Mahanagar Telephone Nigam Ltd (MTNL), are exempt from bidding but are committed to paying the highest bid in the circles where they operate.
Various reasons have been ascribed for the delay, not least the argument that the reserve price for licenses was set too low in light of earlier experience, when successful domestic telcos secured multimillion-dollar sums from overseas telcos eyeing the fast-growing Indian market. Hence the country’s Finance Ministry reportedly wants to double the original reserve price of INR20.2 billion (US$420 million).
Obama to create White House cybersecurity post
WASHINGTON (Reuters) - President Barack Obama said he will name a White House-level czar to coordinate government efforts to fight an epidemic of cybercrime, which even touched his presidential campaign.
"Cyberspace is real and so are the risks that come with it," said Obama in remarks Friday at the White House in which he discussed threats to the nation's digital infrastructure from organized crime, industrial spies and international espionage.
Obama said he would name an official to coordinate cybersecurity policies across the government and organize a response to any major cyber attack.
"I'm creating a new office here at the White House that will be led by the cybersecurity coordinator. Because of the critical importance of this work, I will personally select this official," said Obama. "This official will have my full support and regular access to me."
Obama said his administration would not dictate cybersecurity standards for private companies but would strengthen public-private partnerships and invest in research to develop better ways to secure information infrastructure.
He also stressed the importance of privacy. "Our pursuit of cybersecurity will not -- I repeat, will not -- include monitoring private sector networks or Internet traffic."
Holes in U.S. cybersecurity defenses have allowed major incidents of thefts of personal identity, money, intellectual property and corporate secrets. They also allowed a penetration of Obama's campaign.
"What isn't widely known is that during the general election hackers managed to penetrate our computer systems," said Obama. "Between August and October, hackers gained access to emails and a range of campaign files, from policy position papers to travel plans."
RECOMMENDATIONS
The cybersecurity review, headed by Melissa Hathaway, had urged the president to name a White House coordinator to oversee cybersecurity.
The report, requested by Obama in February, also urged the creation of a strong National Security Council directorate on cybersecurity with a privacy official attached to it.
Other recommendations included preparation of a national strategy to secure U.S. digital networks and stronger international partnerships to fight cybercrime and espionage.
The report said the government, in working with the private sector, should consider tax incentives and reduced liability in exchange for improved security, or increased liability for lax security.
Separately, the Pentagon is considering creating a command dedicated to cyberspace, under the umbrella of U.S. Strategic Command, but Defense Secretary Robert Gates had made no decisions yet, said Pentagon spokesman Bryan Whitman.
"We view cyberspace as a warfighting domain that we have to be able to operate within," said Whitman.
FBR Capital Markets analyst Daniel Ives said Friday's announcement could presage a surge in spending on security software purchased from companies like Symantec Corp and McAfee Inc, both of which have some government sales. "We've heard for so long the government was going to spend. Finally the ball is going to start rolling," said Ives.
John Stewart, Cisco's chief security officer, said some of the important next steps would be on the international stage.
"There's going to be a need for massive international cooperation in all this," he said. "This will show up in varying venues, (like) trade negotiations."
Phillip Dunkelberger, president of security company PGP Corp, said he was hoping for concrete steps to secure the U.S. digital network -- for example, some idea of what the next generation of security architecture would look like.
The cybersecurity report was posted at: http://www.whitehouse.gov/assets/documents/Cyberspace_Policy_Review_final.pdf
(Additional reporting by Andrew Gray in Washington and Jim Finkle in Boston; Editing by Tim Dobbyn)
The emerging business case for enhanced OTT video
Maintaining quality could enable service providers to gain new revenue streams.
Study after study by research analysts points toward a clear and probably irreversible trend: downstream web traffic comprised of streaming internet video is rapidly growing, both in absolute volume and as a percentage of overall traffic. In fact, IDC predicts that by 2013 slightly more than half of all downstream broadband traffic will be streaming video, and that the volume of this traffic alone will exceed the total of all downstream traffic in 2009. In short, streaming video, frequently referred to as over-the-top (OTT) video, threatens to swamp broadband networks as it undermines business models built around linear video broadcasting (cable, DBS, and IPTV).
Broadband Service Providers (BSPs) feel understandably pressured—they are expected to expand their broadband coverage and increase the capacity of their broadband networks while the prices associated with basic broadband services steadily decline. Even with the help of broadband stimulus funds from US taxpayers, it is difficult to construct viable business models that call for pouring capital into infrastructure yielding declining per-user revenue.
Fortunately, there is an emerging business model that can generate the incremental revenue streams required to justify capital investment in broadband infrastructure. Understanding this business model requires a closer examination into the nature of streaming OTT video.
Coincident with the unparalleled growth in OTT video is a change in its composition. Whereas in the past the bulk of OTT video traffic was composed of short-form, YouTube-style clips viewed on laptops, PDAs, and smartphones, increasingly it is made up of long-form TV episodes and movies viewed on flat-panel televisions, sometimes in high definition. Obviously, in addition to fueling the massive growth in the first place, this shift in the composition of streaming OTT video carries with it stringent quality requirements. Many viewers tolerated low-resolution video clips that frequently timed out (i.e., froze) when the video was free and the viewing device was a computer. But when they pay for the content and display it on their HDTVs, these same viewers become extremely sensitive to anything other than crystal-clear video and audio fidelity.
Unfortunately, the best-effort traffic management principles associated with the Internet in general and broadband in particular are not well suited to handle traffic with such stringent service quality requirements. Recognizing this, virtually all video content storefronts (e.g., Apple iTunes, Amazon Video on Demand, Hulu, Netflix, etc.) pay content delivery networks (CDNs) to bypass the somewhat rickety internet and deliver high-value video traffic directly to BSP peering points. CDNs, with well-managed, high-capacity backbone networks and distributed storage points, are able to provide video storefronts with explicit service level agreements (SLAs), something the internet will never be capable of doing.
But video content storefronts still face a challenge: the CDNs do not deliver traffic to the end user, they deliver it to the BSP. The traffic then has to traverse another network before reaching the ultimate end user. To CDNs this did not initially appear to be a problem, as they believed they were “close” to the end user, measured in terms of router hops (in fact CDNs were born primarily as a way to minimize router hops between web browsers and web servers). And, indeed, the point at which CDNs hand off traffic to BSPs is often no more than one or two router hops away from the end user. But while the end user may appear “close” at layer three, he or she is still somewhat distant at layer two. The entire broadband aggregation network, as well as the broadband access facility itself, is a large, multi-hop, layer two network that is often heavily congested. In most cases, that entire network is governed by first-come, first-served, best-effort traffic management mechanisms (or the lack thereof). Obviously this is not an environment well suited for streaming OTT video.
Arguably, this problem can only be rectified by the BSP; they own and operate the only portion of the network between video servers and digital video players that does not carry an explicit SLA. It stands to reason, therefore, that if BSPs could guarantee service quality for specific OTT video streams and provide SLAs on those streams, various parties might be willing to pay them for the service. Here there are two potential revenue sources for the BSP: the end user, paying for “premium internet TV,” or the CDN/content storefront paying for an explicit SLA. (See Figure 1.)
Figure 1.
Whether the end user pays for a superior OTT viewing experience, one that is nearly indistinguishable from a locally attached DVD player, or the CDN/content storefront pays for a guaranteed SLA, the technical requirements placed on the BSP are identical:
1. Identify OTT video “sessions.”
2. Determine policy for each session (e.g., has this subscriber requested premium OTT service?).
3. Determine availability of network resources (i.e., don’t make service quality guarantees that can’t be fulfilled).
4. Allocate specific amount of bandwidth (above and beyond basic high-speed internet access) for the duration of the video.
5. Monitor video quality during session (optional but highly desirable).
6. Release dedicated bandwidth and generate settlement records at session termination.
How these functions are implemented will be service-provider dependent. In many cases they can be implemented within a single device; in others, multiple devices may be required. But in any case, these functions are required to recognize and handle streaming OTT video in a manner that delivers the viewing experience subscribers are looking for — and will pay for.
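As a toy illustration of how the six functions above might fit together, here is a minimal admission-control sketch; the class, field names, and bandwidth figures are all hypothetical:

```python
# Toy sketch of the six-step session flow, with a fixed pool of premium
# bandwidth. Real implementations would sit in DPI gear or policy servers.
class OTTSessionManager:
    def __init__(self, premium_pool_kbps=100_000):
        self.available = premium_pool_kbps
        self.sessions = {}

    def admit(self, session_id, subscriber_is_premium, needed_kbps):
        # Steps 1-4: identify the session, check policy, check resources,
        # and allocate dedicated bandwidth for the stream.
        if not subscriber_is_premium:
            return False  # policy says best-effort only
        if needed_kbps > self.available:
            return False  # never promise what can't be fulfilled
        self.available -= needed_kbps
        self.sessions[session_id] = needed_kbps
        return True

    def release(self, session_id):
        # Step 6: free the bandwidth and emit a settlement record.
        kbps = self.sessions.pop(session_id)
        self.available += kbps
        return {"session": session_id, "kbps": kbps}

mgr = OTTSessionManager()
print(mgr.admit("s1", True, 8000))   # True: HD stream admitted
print(mgr.release("s1")["kbps"])     # 8000
```

Step 5, in-session quality monitoring, is omitted here since it depends entirely on the probe hardware available.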
Most questions service providers have on this topic deal not so much with technology but rather with consumer and regulatory acceptance. Fortunately, there are positive indications on these fronts as well.
From the standpoint of consumer acceptance, surveys conducted by IDC have consistently shown a substantial subset of the population willing to pay incremental fees for an OTT video viewing experience that approximates that of a locally attached DVD player. The size of the subset obviously depends on price and other services that are bundled with the service but, contrary to conventional wisdom held by many BSP marketing departments (“customers won’t pay for anything”), there is a clear willingness to pay for better video.
One factor to keep in mind is how premium OTT video services are marketed. If the service is billed merely as a better way for broadband to handle streaming video, take rates might peak in the 20-25% range. However, if the service is billed as a TV video on demand (VOD) service and bundled with a digital video player (e.g., Roku), take rates may be much higher. The reason is that in the former case, the subscriber may believe that broadband should already deliver such capabilities. In the latter case, the subscriber views the service as a TV/movie service, not a broadband service.
A final point is the regulatory treatment of these services. Many analysts covering the FCC believe, and recent FCC reports seem to concur, that these services would be acceptable from a regulatory standpoint. The reason is that these services, by and large, have nothing to do with the internet. Rather, they take traffic from CDNs and deliver it over broadband using discrete “broadband video channels” with specific bandwidth allocations and separated from the high-speed internet (HSI) access channel using protocol mechanisms. The HSI channel is unimpaired by streaming video. In this regard it is similar to the handling of cable TV and IPTV.
With more and more consumers “cutting the cable cord” and obtaining their video on-line, the business case for premium OTT video services is especially compelling; the capital cost to offer the service (even with bundled digital video players) is modest compared to IPTV, and consumer demand is high and growing. Regulators appear, rightly, focused on stamping out bad behavior rather than obstructing services consumers are clamoring for. In short, all lights appear green.
Verizon offers pay-as-you-go hosting service
NEW YORK (Reuters) - Verizon Communications Inc on Thursday unveiled a pay-as-you go hosting service for corporate customers looking to save money by buying only as much computing capacity as they need at any given time.
Verizon is the latest U.S. telecom to jump on the bandwagon of so-called "cloud computing" services after AT&T Inc said last month that it would offer Web-based storage services for enterprises.
Gartner Research estimated earlier this year that global revenues from cloud computing and storage, or the use of the Web to access those services at remote data centers, will climb 31 percent to $3.4 billion this year.
The fledgling field is led by Internet pioneers Amazon.com Inc, Google Inc and Salesforce.com Inc, but more established technology companies like Verizon, AT&T, IBM and Microsoft Corp are introducing new services this year in a bid to catch up before sales start to boom.
Verizon said that its new "computing as a service" offering can provide customers with remote hosting capacity for applications such as retail websites within an hour after they put in an order.
A time lag of anywhere from four to eight weeks would have been more typical using older technologies.
Verizon, which had promised in February to launch on-demand hosting this summer, said some customers would likely end up switching from traditional hosting to on-demand services because the new products cost less.
"There will be some of that," Michael Marcellin, vice president of global managed solutions told Reuters on Tuesday when asked if the cloud service might cannibalizing the traditional business.
"As people look at this delivery model this is very compelling," he said.
(Reporting by Sinead Carew in New York and Jim Finkle in
Boston; Editing by Edwin Chan and Carol Bishopric)
Eight Asian telcos to build subsea fiber optic network
Higher bandwidth capacity and redundancy will benefit region known for its subsea earthquakes.
A new submarine fiber optic network, known as Asia-Pacific Gateway (APG), is in the offing with a memorandum of understanding signed recently by eight of Asia’s biggest telcos. Targeted to be ready for service in 2011, the 8,000 kilometer network with a minimum design capacity of four terabits per second will link the region’s growing economies.
As demand continues to increase for high bandwidth communication, including enterprise network services and the Internet, the proposed fiber optic network will offer alternative communication routes and nodes. In the event of accidents or subsea earthquakes, it will minimize the impact of a breakdown in services provided by existing regional fiber optic networks.
Using the latest Dense Wavelength Division Multiplexing (DWDM) technologies, the proposed APG network is designed to provide a high-degree of interconnectivity with existing and planned high bandwidth communication systems.
The APG network will connect Japan, Korea, mainland China, Taiwan, Philippines, Hong Kong, Vietnam, Thailand, Malaysia and Singapore. The signatories to the agreement for the development of the gateway are China Telecom, China Unicom, Chunghwa Telecom, NTT Communication (Japan), Vietnam Post and Telecommunications (VNPT), Korea Telecom (KT), Philippines Long Distance Telephone (PLDT) and Telekom Malaysia. The telcos will jointly finance and own the APG network.
Three APG telcos also in AAG network
Three of the telcos in the APG consortium – PLDT, VNPT and Telekom Malaysia – are also participating in the Asia-America Gateway (AAG) project, which spans 20,000 kilometers to connect Southeast Asia with the US.
Aside from Bayan Telecommunications and Eastern Telecommunications of the Philippines, the other members of the AAG consortium are the Brunei government, AT&T (US), Bharti Airtel (India), Communications Global Network (UK), Pacific Communication (Cambodia), CAT Telecom (Thailand), PT Indosat (Indonesia), PT Telekomunikasi (Indonesia), Starhub (Singapore), Telecom New Zealand and Telstra (Australia), while the participants from Vietnam are Saigon Postel and Viettel.
Originally scheduled for completion in the last quarter of 2008, the project has suffered construction delays that pushed the deadline to August this year. The US$550 million AAG network will link Southeast Asian countries, namely Malaysia, Singapore, Thailand, Brunei, Vietnam, Hong Kong and the Philippines, with US territories in the Pacific, such as Guam and Hawaii, and the west coast.
In another development, NTT Communications announced its acquisition of Pacific Crossing without providing financial details. Pacific Crossing operates the 21,000-kilometer fiber optic network PC-1, which spans Japan and the USA with two landing points in each country and has a capacity of 3.2 terabits per second.
Tekelec turns SMS deployments on their head
SMS router leverages FDA to drive lower cost per SMS message.
In Q1 09, Verizon Wireless reported that its customers sent or received an average of 1.4 billion text messages each day, totaling more than 127 billion text messages for the quarter. What's more, Verizon Wireless customers sent nearly 2.1 billion picture and video messages and completed 48.6 million music and video downloads during the quarter.
The traffic traversing Verizon Wireless’ network along with that of other large mobile players such as AT&T, Sprint and T-Mobile has led Frost and Sullivan to forecast that global SMS message volume is growing at a CAGR of 15.6 percent (2007-2011). However, SMS revenues are only growing at a CAGR of 5.9 percent over the same period.
Figure 1. Global SMS traffic.
Source: Tekelec
Figure 2. Global SMS Revenues.
Source: Tekelec
Still, the increased momentum around SMS is driving wireless operators to rethink how they handle SMS in their respective networks.
Enter Tekelec and its SMS router. The vendor claims the router helps operators handle SMS traffic while keeping costs down by using a First Delivery Attempt (FDA) method for Mobile-Originated (MO) and Application-Originated (AO) messages.
By incorporating flexible routing rules, the router enables traffic to be routed on any SMS parameter: sender, recipient, SMSC address, data coding scheme, mobile switching center and message content.
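As a rough sketch of how such parameter-based routing might work (the rule format and destination names here are invented for illustration, not Tekelec's actual configuration), a first-match rule table is one simple model:

```python
# Hypothetical first-match routing table: each rule pairs a set of
# SMS parameter values with a destination; the first rule whose
# parameters all match the message wins.

def route_message(msg, rules, default="legacy_smsc"):
    """Return the destination for msg, a dict of SMS parameters
    (e.g. sender, recipient, smsc_address, data_coding)."""
    for match, destination in rules:
        if all(msg.get(key) == value for key, value in match.items()):
            return destination
    return default

rules = [
    ({"recipient": "8888"}, "voting_app"),       # televoting short code
    ({"data_coding": "binary"}, "legacy_smsc"),  # binary content needs store-and-forward
    ({"sender_type": "application"}, "fda_router"),
]

print(route_message({"recipient": "8888", "sender": "+15551234"}, rules))  # voting_app
print(route_message({"recipient": "+15550000"}, rules))                    # legacy_smsc
```

Real routers also match on prefixes and message content, but the first-match-wins structure is the common pattern.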
Alan Pascoe, senior product marketing manager for Tekelec, argues that the problem with legacy SMSCs lies in their overall approach to distributing and storing SMS messages. And Pascoe should know: before joining Tekelec, he was an SMS service designer for wireless operator O2.
“Routers have been around for a number of years,” he said. “A traditional SMSC takes a store-and-forward approach. What a router does is turn that completely around, forwarding first and then storing, because the chances of a text message getting through on the first attempt are a lot higher these days than they were five to 10 years ago.”
Cap and grow
Given the large investments wireless operators have already made in their existing SMSC networks, the SMS router and the overall Tekelec SMS network concept allow the service provider to cap and grow their existing SMSC investments.
In so doing, the SMS Router can help service providers extend their existing investments while making a transition to the next-gen router platform. What’s more, the SMS router supports SS7 and SIGTRAN signaling as well as CAMEL and DIAMETER billing protocols.
Figure 3. Tekelec's SMS Router.
Source: Tekelec
FDA is the key enabler to help wireless operators make the legacy-to-next-gen SMS network transition.
By using FDA, the SMS router can free up legacy SMSC capacity by delivering the bulk of messages on the first attempt. The existing SMSC is required to handle only the roughly 15 percent of traffic that must be stored until it can be delivered.
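The forward-then-store flow can be sketched as follows; this is a simplified illustration with invented function names, with delivery success simulated at roughly the 85 percent rate implied above:

```python
import random

def attempt_direct_delivery(msg):
    """Stand-in for a subscriber lookup plus direct delivery to the
    handset; here success is simulated at roughly 85 percent."""
    return random.random() < 0.85

def handle_message(msg, smsc_store):
    # First Delivery Attempt: try to deliver immediately, bypassing the SMSC.
    if attempt_direct_delivery(msg):
        return "delivered"
    # Only failed messages fall back to the legacy SMSC's store for retry,
    # so it carries only a minority of total traffic.
    smsc_store.append(msg)
    return "stored"

random.seed(1)
smsc_store = []
results = [handle_message(f"msg{i}", smsc_store) for i in range(1000)]
print(f"share of traffic stored on SMSC: {len(smsc_store) / 1000:.0%}")
```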
To provide a better picture of the benefits of making the switch to a next-gen SMS network, Tekelec developed a business case in collaboration with Frost & Sullivan to help wireless operators profitably deploy the SMS router and adopt its overall SMS network concept. The business case covers four distinct elements: SMS Offload – First Delivery Attempt; SMS Offload – Voting Traffic; Blocking SMS MO Spoofing; and the combination of all three.
In an SMS offload application, Tekelec claims a wireless operator could potentially reduce the current SMSC load by 80-90 percent. This load reduction, however, depends on the FDA success ratio.
Tekelec claims that this cap-and-grow strategy allows a wireless operator to realize 60 percent in savings over a traditional SMSC: opting for an SMS network with an SMS router would cost only $0.9 million, versus $2.2 million for a new SMSC.
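The savings figure follows from the quoted prices as simple arithmetic:

```python
smsc_cost = 2.2e6    # new traditional SMSC
router_cost = 0.9e6  # SMS network built around an SMS router

savings = (smsc_cost - router_cost) / smsc_cost
print(f"savings: {savings:.0%}")  # savings: 59%
```

That works out to about 59 percent, which Tekelec evidently rounds up to 60.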
“The fourth benefit of the SMS network concept is where a wireless operator can combine the offload and the voting offload applications,” Pascoe said. “You can pretty much use the same equipment to deliver both of those savings from the voting and from the first delivery point of view.”
Breaking SMS bottlenecks
In bridging the legacy and next-gen SMS worlds, the SMS router can help service providers overcome traditional SMS capacity bottlenecks by leveraging load balancing and throughput control techniques. Wireless operators can support high-end SMS customers with the ability to prioritize traffic based on premium SLAs.
One application where traffic flow prioritization will come in handy is televoting.
Televoting is not just vendor speak, however. Verisign’s Mobile Media Division, which provides Internet infrastructure services, predicts that mobile voting and other interactions with the government via SMS will become commonplace.
However, one of the common problems wireless operators have in dealing with voting applications is that existing SMSCs can't handle voting traffic spikes or balance loads throughout the SMS network.
“Using a router approach to conduct voting offload has some distinct advantages over using an SMSC,” said Pascoe. “Typically, this sort of traffic is delivered in a huge spike, but with an SMSC you need to buffer all of those messages, which can overload it, meaning that messages are dropped because the buffers fill up quickly.”
In a televoting application, the SMS router can manage peak traffic control while preventing legacy SMSC overload by delivering traffic directly to the application. Voting messages never need to be stored for later delivery to a subscriber's handset.
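A token bucket is one common way to implement this kind of peak traffic control; the sketch below is a generic illustration of the technique, not Tekelec's implementation:

```python
class TokenBucket:
    """Throttle delivery toward an application at a sustained rate
    while allowing short bursts up to the bucket capacity."""

    def __init__(self, rate, capacity):
        self.rate = rate          # tokens replenished per second
        self.capacity = capacity  # maximum burst size
        self.tokens = capacity
        self.last = 0.0

    def allow(self, now):
        # Replenish tokens for elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True   # forward this vote to the application now
        return False      # queue or defer it instead of overloading downstream

bucket = TokenBucket(rate=100, capacity=10)  # 100 msg/s sustained, bursts of 10
# A spike of 50 votes arriving at the same instant: only the burst passes.
passed = sum(bucket.allow(now=0.0) for _ in range(50))
print(passed)  # 10
```

Deferred messages can then be drained at the sustained rate, which is exactly the smoothing an SMSC's finite buffers fail to provide during a voting spike.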
Pascoe adds that by untangling bottlenecks, the operator can cut opex by keeping customers happy.
“Most of the savings you’ll actually get is in the brand awareness and the reduction of customer care complaints,” Pascoe said. “When you look into an operator’s business model in terms of customer care complaints, they usually attribute a dollar value to each particular complaint.”
Posted by muhammad abbas at 2:16 AM 0 comments
Nokeena pushes boundaries of online video delivery
Sets focus on enhancing user Quality of Experience.
In this latest Telecom Engine Audiocast, Rajan Raghavan, CEO and co-founder of Nokeena Networks talks to us about the company’s debut and how it can bring a value-added service experience to the burgeoning online video business.
Raghavan addresses the following questions in this Audiocast:
• ABI Research believes the number of viewers who access video via the Web will quadruple in the next few years, reaching at least 1 billion by 2013. To start, what is your feeling about the online Over the Top (OTT) video market?
• How does Nokeena approach the OTT video opportunity?
• Why is the Nokeena product beneficial for content owners, media publishers and distributors?
• How is Nokeena different from other streaming software currently available on the market?
• The past few years have not been the easiest economic times for new startups, but what makes 2009 the right time for Nokeena to enter into this market?
Posted by muhammad abbas at 2:13 AM 0 comments
Eight Asian telcos to build subsea fiber optic network
Higher bandwidth capacity and redundancy will benefit region known for its subsea earthquakes.
A new submarine fiber optic network, known as Asia-Pacific Gateway (APG), is in the offing with a memorandum of understanding signed recently by eight of Asia’s biggest telcos. Targeted to be ready for service in 2011, the 8,000-kilometer network with a minimum design capacity of four terabits per second will link the region’s growing economies.
As demand continues to increase for high bandwidth communication, including enterprise network services and the Internet, the proposed fiber optic network will offer alternative communication routes and nodes. In the event of accidents or subsea earthquakes, it will minimize the impact of a breakdown in services provided by existing regional fiber optic networks.
Using the latest Dense Wavelength Division Multiplexing (DWDM) technologies, the proposed APG network is designed to provide a high-degree of interconnectivity with existing and planned high bandwidth communication systems.
The APG network will connect Japan, Korea, mainland China, Taiwan, Philippines, Hong Kong, Vietnam, Thailand, Malaysia and Singapore. The signatories to the agreement for the development of the gateway are China Telecom, China Unicom, Chunghwa Telecom, NTT Communication (Japan), Vietnam Post and Telecommunications (VNPT), Korea Telecom (KT), Philippines Long Distance Telephone (PLDT) and Telekom Malaysia. The telcos will jointly finance and own the APG network.
Three APG telcos also in AAG network
Three of the telcos in the APG consortium – PLDT, VNPT and Telekom Malaysia – are also participating in the Asia-America Gateway (AAG) project, which spans 20,000 kilometers to connect Southeast Asia with the US.
Aside from Bayan Telecommunications and Eastern Telecommunications from the Philippines, the other members of the AAG consortium are the Brunei government, AT&T (US), Bharti Airtel (India), Communications Global Network (UK), Pacific Communication (Cambodia), CAT Telecom (Thailand), PT Indosat (Indonesia), PT Telekomunikasi (Indonesia), Starhub (Singapore), Telecom New Zealand and Telstra (Australia), while the participants from Vietnam are Saigon Postel and Viettel.
Originally scheduled for completion in the last quarter of 2008, the project has seen construction delays push the deadline to August this year. The US$550 million AAG network will link Southeast Asian countries, namely Malaysia, Singapore, Thailand, Brunei, Vietnam, Hong Kong and the Philippines, with US territories in the Pacific, such as Guam and Hawaii, and with the US west coast.
In another development, NTT Communications announced its acquisition of Pacific Crossing without providing financial details. Pacific Crossing operates the 21,000-kilometer fiber optic network PC-1, which spans Japan and the USA with two landing points in each country and has a capacity of 3.2 terabits per second.
Posted by muhammad abbas at 2:08 AM 0 comments
NAB: Digital Rapids enhances C2, announces MediaMesh
Tuesday, April 21, 2009
Exhibiting at NAB 2009 in Las Vegas this week, encoding vendor Digital Rapids announced new enhancements to its content delivery software and a new media distribution appliance, as well as other news. The company's Digital Rapids C2 data delivery software now includes support for unicast, multicast and hybrid distribution models and is targeted at overcoming traditional IP network performance issues. The new MediaMesh RX appliance is aimed at receiving and repurposing content from centralized distribution sources, enhancing syndicated delivery of ad spots within long-form content to broadcast affiliates and distribution partners.
Posted by muhammad abbas at 1:44 AM 0 comments
Ifbyphone offers carriers value-add services via SIP trunking
Ifbyphone is announcing this week that it is offering all of its Phone 2.0/mashup style value-added services as white label offerings via SIP trunking, enabling softswitch-equipped carriers to sell phone automation services on top of their VoIP/SIP offerings.
"As transport prices continue to collapse, CLECs and regional telcos and hosted providers need applications to add value," Ifbyphone CEO Irv Shapiro said. "The real power of SIP is a signaling protocol to allow a telephony company to gain access to applications. We're making our applications available to anyone, any carrier to use those applications in their customer base, just as someone can [incorporate] a web service in a web page."
Carriers using softswitches can add Ifbyphone features like full-function IVR, Call Queuing, Call Tracking and "Find Me" Call Forwarding, and complete applications such as Voice Broadcasting, Store Locator, and Lead Distributor without having to do the capital outlays for building their own applications.
The key in implementing the service is leveraging SIP trunking, allowing carriers to connect to Ifbyphone and route calls to what the company is calling "Smart ports" within the Ifbyphone platform, while also retaining origination and termination. Small to medium-sized carriers can access the features and services on a pay-as-you-go model, rather than investing significant dollars in capital expenditures and programming.
Shapiro outlined three business scenarios for implementing the services, with a $10,000 one-time setup fee for setting up the SIP trunk attached to all of them. For a limited number of customers needing IVR-style features, a carrier could route calls to Ifbyphone and simply get charged back on a per-minute basis for usage of those features. Larger usage would necessitate the rental of dedicated applications ports, with some capability to burst/oversubscribe on the first month to balance usage correctly -- but you have to pay for what you use. Finally, Ifbyphone is willing to work out a partnering relationship with a larger CLEC of sufficient size with a revenue share model.
While voice is the current focus, Shapiro said any SIP applications could be delivered via the same model. "We're the first SIP applications warehouse on the net," he remarked.
Ifbyphone is certainly going through boom times with its current business. The company has reportedly been growing at a rate of 10 to 15 percent per month over the last 20 months, with the first quarter of 2009 showing 50 percent growth over Q4 2008.
Posted by muhammad abbas at 1:43 AM 0 comments
4G race is more complex than many realize
Though a Nokia executive recently hinted that WiMAX is bound for the scrap-heap of history, the reality is much more complicated, according to Maravedis analyst Robert Syputa. What is done with the network, he contends, "is becoming more important to the end customer than the technology that runs it," meaning that "a shift in openness is needed, and will increasingly be demanded." Syputa contends that WiMAX has acted as a "Trojan horse" for the wireless industry by opening up a market that has traditionally hidden behind a walled garden of exclusivity.
Posted by muhammad abbas at 1:42 AM 0 comments
YouTube signs premium content deals
YouTube continued its push for premium content Thursday announcing several content deals with Hollywood studios that will bring scores of movies and TV shows to the Google-owned site. The Goog mentioned in the release that a paid model for the premium content could be floated, which would be a marked change for a company whose bread and butter has always been free, UGC video content.
MGM, Lionsgate, CBS, and Sony all agreed to send content to YouTube as part of the agreement, and advertising revenue generated from ad placements around this content will be split between the studios and YouTube.
A YouTube spokesman also disavowed the Credit Suisse report released last week that posited that YouTube was set to lose close to $500 million in 2009, saying it was assumptive rather than fact based.
Posted by muhammad abbas at 1:42 AM 0 comments
NAB: GulfPines chooses Falcon, Verimatrix
Mississippi-based GulfPines Communications is using the Falcon IP/Complete IPTV delivery system and the Verimatrix Video Content Authority System for IPTV, the companies announced at NAB 2009 in Las Vegas, where the two vendors are demonstrating the Falcon system.
Falcon, the U.S. partner of French video technology firm Thomson, and Verimatrix have long supported an integrated, end-to-end solution for content aggregation and delivery that appeals to regional and rural telcos looking for wholesale help. The companies also recently worked with GTel Teleconnections in Germantown, N.Y.
GulfPines President Charles F. Fail said in a press release, "We selected the integrated Falcon IP/Complete and Verimatrix solution because it is the only one that meets all our needs. The solution is a standards-based IPTV solution featuring all elements from a single provider, from satellite to the set-top box. Because it is completely integrated, the solution offers flexible delivery for optimal efficiency. With the Falcon solution, we will make one call - 24/7 - for any service or technical issue. This will allow us to provide the excellent service our customers expect. It will also enable us to promptly offer new services in the future."
Posted by muhammad abbas at 1:41 AM 0 comments
Nortel soldiers on with carrier VoIP
As it continues to work through an ugly bankruptcy process, Nortel took time to tout the addition of four more regional carriers as carrier VoIP customers. The company reports it has a total of 135 Communications Server (CS) 1500 customers globally.
The customers - Arkwest Communications, Dakota Central Telecommunications, Hancock Telephone and Venture Communications Cooperative - aren't exactly what we'd call the largest carriers of note these days. Arkwest is providing service within Yell County, Ark., for example, and offers both phone and IPTV services. Dakota and Venture are both Nortel DMS-10 switch customers and have added the CS 1500 to enable new services such as click-to-call, PC access to voice messages, and end-user web portals.
Hancock may be the biggest customer win out of the bunch. Nortel is providing both the CS 1500 softswitch along with "all" the products and professional services required to design, build, and manage the entire IP network.
If Nortel survives as an independent, albeit shrunken, entity, it is likely carrier VoIP and unified communications would be at the heart of the company that emerges from bankruptcy.
Posted by muhammad abbas at 1:40 AM 0 comments
Latest Offer From Virgin: Pink Slip Protection Plan
Virgin Mobile is offering to forget about your phone bill if you get fired…but only for a while.
Current prepaid Monthly Plan customers will have an open enrollment period through June 30 in which to apply for Pink Slip Protection.
"With the unemployment rate rising, the fear of job loss or salary reductions have made consumers watch every dollar," said Dan Schulman, Virgin Mobile's CEO, in a statement. "These issues tend to impact our prepaid base more than many wireless users, so we hope this program can offer some peace of mind to our customers."
In addition, new customers choosing Virgin Mobile USA's new Monthly Plans Without Annual Contracts - $29.99, $39.99 and Totally Unlimited for $49.99 - will automatically be enrolled in the Pink Slip Protection program. Still, read the contract carefully, as additional conditions do apply.
Posted by muhammad abbas at 1:37 AM 0 comments
Ballmer, IBM reportedly surprised by Oracle-Sun deal
Reporters caught up with Microsoft CEO Steve Ballmer in Moscow to get his take on Oracle's deal to buy Sun Microsystems for $7.4 billion.
But apparently Ballmer, who is rarely at a loss for words, didn't exactly have a sound bite at the ready.
"I need to think about it," Ballmer told reporters in Moscow, according to Reuters. "I am very surprised."
I'm hearing that Ballmer wasn't the only one surprised by Monday's deal. According to a source of mine, IBM hadn't given up on purchasing Sun and was blindsided by Oracle's move.
Oracle is, of course, one of Microsoft's chief rivals in the database and business applications space--a fact that Ballmer highlighted in an interview in February. Sun is also a longtime rival, although the two companies have had a technology partnership in recent years stemming from their settlement of legal hostilities back in 2004.
I imagine we'll hear far more from Ballmer and Microsoft in the coming days and weeks.
Posted by muhammad abbas at 1:37 AM 0 comments
Oracle gets Sun for $7.4 billion, MySQL for $0
Back in the early days of computing, there was no such thing as a "software vendor." Companies like IBM sold hardware/software integrated solutions and, really, software was developed simply to sell the value of the hardware.
With Monday's announcement that Oracle is acquiring Sun for $7.4 billion, however, Oracle is signaling its own "iPod moment," seeking to compete with Hewlett-Packard, IBM, and others in integrated hardware/software systems.
It's a bold move, and not for the faint of heart. But then, no one would ever accuse Oracle of being faint-hearted.
"I believe this is the first step down a different path," Sun CEO Jonathan Schwartz said in an e-mail to Sun employees, except that it's not, as Gordon Haff points out in a post on CNET.
What is new in the deal is that Oracle finally gets its wish to own MySQL. In 2007 Oracle offered as much as $850 million for MySQL, the third of its offers for the open-source database company.
This time, Oracle effectively got MySQL for free, as the valuation for Sun almost certainly wasn't raised much by its MySQL asset, acquired in 2008 by Sun for $1 billion.
What Oracle will not want, however, is for its customers to get MySQL for free.
Importantly, Oracle's new "systems" approach gives it the ability to digest a host of open-source projects like MySQL that might otherwise struggle to make money, and to monetize them heavily by burying them in hardware "systems." It's a smart move driven by a company that knows that open source as a religion has faded, and that open source as a key driver of innovative IT is just beginning.
It does, however, potentially give Oracle an antitrust problem in MySQL, as ZDNet's Dana Blankenhorn posits. MySQL's market share in the enterprise database market is negligible, but its share of the exploding Web database market is dominant.
While I don't expect the U.S. Justice Department or Federal Trade Commission to launch an antitrust action against Oracle relative to MySQL, it's important to note that this acquisition makes Oracle the clear behemoth in databases, past (enterprise) and future (Web).
Ultimately, however, this acquisition is not about MySQL. At least, not yet.
It's about hardware/software systems, primarily, and to the extent that software is involved, it's about Java, as called out by Oracle CEO Larry Ellison. Over time, the MySQL component will become increasingly important, but for now this Sun acquisition gives Oracle exceptional control over integrated solutions for its customers, as well as a software portfolio with massive potential.
The industry just changed. Oracle raised the stakes of the game. The new ante to get into the game is integrated hardware/software systems, and as IBM, Microsoft, and Oracle increasingly demonstrate, open-source software plays an increasingly important role in feeding these systems.
Posted by muhammad abbas at 1:36 AM 0 comments
Oracle buys integration challenge along with Sun
Through one important piece of corporate computing jargon--"integration"--Oracle has found a justification for its $7.4 billion acquisition of Sun Microsystems. Now it will have to convince historically skeptical customers, too, that the idea makes sense.
The all-cash acquisition agreement--announced Monday, costing Oracle $5.6 billion with Sun's cash factored in, and expected to close this summer--puts the innovative but financially bumbling Sun out of its misery after IBM's move to buy it fell apart earlier in April. The way to fit Sun's technology into Oracle's business model goes back to a project called Raw Iron that's more than a decade old.
Raw Iron ideas placed application software front and center while demoting the server hardware itself and the operating system to a subordinate role. The customer who needs some database software need hardly know what's going on under the covers.
What's smart about the approach is that it lets Oracle profit from Sun's diverse technology--which includes not just servers but also open-source software including Java and the MySQL database that Oracle already tried to buy years ago--without disrupting its own business too much.
Oracle signed a Raw Iron partnership with Dell and worked on it with Sun, IBM, and then-independent Compaq. With Sun's technology in house, one major challenge of those deals--who's in the driver's seat--evaporates. There's no longer any question about which partner owns the customer relationship, which services the technical support contracts, and how the sales revenue is divvied up.
Will server appliances work this time?
Here's the rub, though. Raw Iron, along with the related concept of server appliances that arrived a few years later, was a marketplace dud.
Customers appreciate integrated technology to an extent, but Raw Iron and server appliances quietly submerged beneath the waves. Also worrisome for Oracle is the failure of one of its integration ideas, Unbreakable Linux. Customers by and large ignored this Oracle attempt to offer its own version of Linux, a clone of market-leading Red Hat's product.
Oracle Chief Executive Larry Ellison is a true believer, though, making the sales pitch in the company's official statement:
"Oracle will be the only company that can engineer an integrated system--applications to disk--where all the pieces fit and work together so customers do not have to do it themselves," Ellison said. "Our customers benefit as their systems integration costs go down while system performance, reliability, and security go up."
He does have a point. Sun has always focused centrally on the database market, and it has compelling technology assets for it that it hasn't been able to sell effectively: its current Niagara and the delayed higher-end Rock multicore processors, its Solaris operating system, and its Thumper storage servers with tremendous built-in data capacity.
And selling products at this high level of integration gives Oracle a way to ingest Sun's considerable open-source assets--among them Java, MySQL, Solaris, GlassFish, NetBeans--without too much indigestion. It might even give Oracle some incentive to be more active with the open-source community it's mostly kept at arm's length.
The once and future server market
Another issue, though, is that server appliances are to an extent an artifact from an earlier era, when companies bought and managed discrete systems. That remains a big business, but it's at odds with two important trends gaining steam in the industry.
First is virtualization, chiefly through EMC's VMware software. This lets a single server run multiple operating systems, with the software collection moving flexibly from one physical machine to another as workload demands shift. By breaking the hard link between hardware and software, virtualization undermines the integration sales pitch and inserts a third party's technology between the server and its higher-level software.
Second is cloud computing, where applications run on central servers on the Internet rather than in a company's own confines. Cloud computing takes many forms, but from Oracle's perspective, an excellent example is Salesforce.com, whose sizable cloud-computing service competes directly with Oracle's Siebel business for customer relationship management chores such as tracking who bought what and when their warranty is up for renewal.
But having Sun's hardware assets in-house gives Oracle more flexibility to adapt to cloud computing on its own, in particular through Sun's recently relaunched Network.com cloud computing infrastructure.
Financial complications
Sun Chairman and co-founder Scott McNealy and even more so CEO Jonathan Schwartz likely are breathing a sigh of relief. Sun's stock plunged after IBM's attempted acquisition of Sun fell apart, but with an Oracle acquisition offer also on the table, it's now clear why Sun could play chicken with IBM then issue a statement about the board's faith in Schwartz after IBM walked. On Monday, Sun's stock surged 36 percent to $9.07 in mid-morning trading.
Oracle seemed eager to justify the price, arguing it will improve Oracle's earnings per share significantly and that it will help the company more than earlier massive acquisitions.
"We expect this acquisition to be accretive to Oracle's earnings by at least 15 cents on a non-GAAP basis in the first full year after closing," Oracle President Safra Catz said in a statement. "We estimate that the acquired business will contribute over $1.5 billion to Oracle's non-GAAP (generally accepted accounting principles) operating profit in the first year, increasing to over $2 billion in the second year. This would make the Sun acquisition more profitable in per-share contribution in the first year than we had planned for the acquisitions of BEA, PeopleSoft, and Siebel combined."
It should be noted that Oracle did surprisingly well integrating BEA Systems, PeopleSoft, and Siebel--despite having its own directly competing products in each case--but those were software companies. Sun is much more, and the future of its hardware business is cloudy.
Standalone server sales? At Oracle?
Oracle can sell software-hardware package deals and build a Sun-based cloud service, but how well will it serve customers who want to run their own machines with their own software? It's likely those companies will look elsewhere unless Oracle can show it truly wants to be a full-fledged hardware company.
It's not clear how much Oracle's financial projections rely on the strength of that standalone server business. Historically, selling software has much nicer profit margins. It's also not clear how much of a hit Oracle is expecting to its current software business after a Sun acquisition turns present allies into rivals.
Oracle also must deal with the fact that server makers Hewlett-Packard, Dell, and IBM could become less eager to promote Oracle's software. Because massive database servers are so complicated, Oracle has relied on tight sales, support, and marketing partnerships, and those companies could lose enthusiasm if their server sales forces start seeing Oracle offering competing bids.
At least Oracle's acquisition faces less of an antitrust hurdle than IBM's. Big Blue and Microsoft offer viable database competitors to Oracle's and Sun's, and with Oracle buying Sun there would still be four major server makers, rather than the three that would have been left standing had an IBM acquisition gone ahead.
So Sun shareholders and government officials likely will be convinced of the merits of the deal. The ultimate success, however, will depend on how Sun and Oracle's customers see it.
Posted by muhammad abbas at 1:35 AM 0 comments
Video game industry sales finally take a hit
March revenues for the video game industry dropped 17 percent from a year ago, the NPD Group reported Thursday, the first time in the current recession that the business has seen sales fall.
For the month, the analyst firm reported that the industry turned in total sales of $1.43 billion, down 17 percent from $1.72 billion a year earlier. Hardware sales were down 18 percent, while software was down 17 percent.
But while the numbers look poor, NPD analyst Anita Frazier said she attributed some of the drop to the vagaries of the calendar.
"While it might be tempting to jump to the conclusion that the sky is starting to fall on the video games industry given this month's results, it's important to remember that two very big things are different this year than last," Frazier wrote in a note accompanying NPD's report. "First, Easter fell in March last year whereas it fell in April this year, and last March included the release of Super Smash Bros. Brawl, which went on to become the fourth best-selling game in 2008."
Perhaps, but one game's fortunes are unlikely to be enough to turn around an entire industry, especially given that hardware sales dropped about the same as overall revenues.
Overall sales were also down 2.7 percent from February's $1.47 billion, and each of the six hardware platforms NPD tracks--Sony's PlayStation 3, PS2 and PSP, Microsoft's Xbox 360 and Nintendo's Wii and DS--had lower sales in March than in February. The PS3 and Wii led the drops, with 21.0 percent and 20.2 percent lower sales, respectively.
By comparison, the Xbox saw its sales fall between February and March, but less than the PS3 and Wii, and Frazier reported that Microsoft's console was the only one with good year-over-year news.
"While it's not unusual for March hardware sales to be lower than February," Frazier wrote, "I thought we'd see higher unit sales on most platforms. The Xbox 360 was the only platform to achieve a year-over-year sales increase."
Frazier also said Nintendo's numbers were noteworthy, given the effect of Super Smash Bros. Brawl on the company's March 2008 sales.
"Wii and NDS hardware sales remained brisk, taking the top two spots for (March 2009) in hardware unit sales," she wrote. "It's important to keep in mind that the (game's) effect from last year impacted hardware sales as well, so while the year-over-year comps are down for the Wii, the sales are still impressive."
Still, the Wii--the darling of the video game industry media since its surprise success became almost institutionalized--has seen some negative press recently.
In March, for the first time, the PS3 outsold the Wii in Japan, and many observers wondered if that milestone indicated that Nintendo's console's dominance had finally come to an end.
Posted by muhammad abbas at 1:33 AM 0 comments
Swedish antipiracy law: Traffic down, ISP rebels
Immediately following the enactment of a new Swedish antipiracy law on April 1, Internet traffic in Sweden plummeted--and it has yet to return to prior levels.
According to Netnod, an organization that measures Internet traffic at exchange points between Swedish and international networks, traffic dropped from average data rates of about 160 gigabits per second to about 90Gbps, and has remained at that level since the day the new law went into effect.
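Taken at face value, Netnod's figures put the decline at a bit under half of all measured traffic, a rough sketch of the arithmetic:

```python
# Netnod's reported Swedish exchange-point traffic, before and after April 1.
before_gbps = 160
after_gbps = 90

drop = (before_gbps - after_gbps) / before_gbps
print(f"{drop:.0%}")  # prints "44%" -- a bit under half of all traffic
```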
Netnod has declined to make the connection between the new antipiracy law and the traffic drop since it only measures traffic without identifying what sort of activity is behind the numbers. Other large Internet service providers won't release their numbers.
But Jon Karlung, CEO of Bahnhof, a comparatively small, outspoken broadband operator that has expressed opposition to the new antipiracy law, explains what his company has seen.
"Almost half the Internet is gone," Karlung told CNET News over the telephone from Sweden. "Likely, it is the torrent traffic that has declined, but I cannot say whether this traffic is legal or illegal."
The so-called IPRED law originated from the European Union's "Intellectual Property Rights Enforcement Directive." IPRED stipulates that property rights holders can take their grievances to a court, which will examine the evidence and decide whether the name of the holder of an IP address will be released.
The guilty verdict in the high-profile Pirate Bay trial, announced earlier Friday, was not affected by IPRED, since the new law applies only to file sharing that took place after April 1. But copyright holders have already turned to the new law in an attempt to stop file sharers.
On the law's first day, five Swedish audio book publishers went after an alleged illegal file sharer in court, in hopes of revealing the identity of the person behind a particular IP address.
And two days after the law came into force, two men were arrested, allegedly for sharing copyrighted files and administering a "rip box," which removes copy protection on purchased films and music. International police were involved in the arrests.
But now Bahnhof says it won't release the names connected to IP addresses, since its understanding of an earlier law based on another EU directive is that ISPs must erase traffic data for the sake of the subscribers' integrity.
"Our ambition is not to store any traffic data," Karlung said, adding that as a consequence, "Bahnhof cannot provide information on alleged piracy to courts, since we do not have the information stored. Thus IPRED becomes effectless."
Bahnhof's interpretation of the earlier law gets support from the Swedish Post and Telecom Agency, a regulatory body that's akin to the Federal Communications Commission in the U.S.
"There is no general obligation to store this kind of data for all subscribers," PTS attorney Peder Cristvall told the magazine Computer Sweden.
Bahnhof says it's opposing the new antipiracy law because it stifles Internet innovation and development, naming the Swedish companies MySQL, Skype, and Spotify as examples of businesses whose success has benefited from Sweden's extensive file-sharing culture.
Instead, Bahnhof says copyright holders must develop business models and Internet tools that allow subscribers to share files legally.
Karlung says that in the short term, Bahnhof's profits will rise under the IPRED law due to lower bandwidth costs, but that in the long term, sales of the fast Internet connections used for file sharing could decline.
"It is possible that we and other ISPs could sell fewer fast connections, but that won't affect our profits," Karlung said.
But just to make things a bit more complicated, the Swedish government is expected to propose a new data storage law based on a third EU directive in June. This law could force Bahnhof to store and share its data in the future anyway, much to Karlung's disappointment.
"It is this Orwellian nightmare state that is developing, where no one sees the dynamic of the Internet," a sighing Karlung says from the other side of the Atlantic.
Posted by muhammad abbas at 1:31 AM 0 comments
Tech layoffs: The scorecard
With the overall economy slumping, the tech industry is taking its fair share of hits. We'll keep updating the chart below as news of company changes comes in. See our complete coverage of how the tech sector is faring here: Tracking the tech downturn.
See also: The spreadsheet of sunshine: Who's hiring.
Posted by muhammad abbas at 1:30 AM 0 comments