SiliconANGLE – Extracting the signal from the noise.

Oracle appliance gives more bang for fewer bucks (Fri, 22 Aug 2014 19:14:42 +0000)

The Oracle Database Appliance (ODA) provides more value at significantly less cost than custom Oracle implementations on white box hardware for both integrators and their customers, writes Wikibon Co-Founder and Chief Technical Officer David Floyer in his latest Wikibon report. Returning to a favorite theme – Single Managed Entities (SMEs), complete stacks that include hardware, database, middleware and, ideally, applications to increase efficiency and provide huge savings – Floyer analyzes the multiple ways in which an ODA is more efficient and less expensive to purchase, license and manage than more traditional custom-built Oracle stacks.

Among the benefits detailed in the report are:

  • Oracle licensing costs are reduced because Oracle only charges for the virtual cores allocated to running the database on the ODA, versus its normal practice of charging for all virtual cores in the server.
  • Implementation is faster because all the components are standard, pre-tested and pre-integrated.
  • Support staff is reduced for both the independent software vendor (ISV) and the user because the standard stack is a known entity with proven reliability and few if any unique problems.
  • Users have a single “throat to choke” in the event of a problem.
  • Purchasing and financial management are simpler.
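The licensing saving in the first bullet above is simple arithmetic. A minimal sketch, with a per-core price and core counts that are illustrative assumptions, not figures from the Wikibon report:

```python
# Hypothetical illustration of the ODA licensing model described above.
# The per-core price and core counts are assumptions, for illustration only.
PRICE_PER_CORE = 47_500  # assumed license list price per core (USD)

def license_cost(cores_charged):
    """License cost when Oracle charges for the given number of cores."""
    return cores_charged * PRICE_PER_CORE

server_cores = 24      # all virtual cores in the server (traditional charging)
allocated_cores = 8    # virtual cores actually allocated to the database (ODA)

traditional = license_cost(server_cores)
oda = license_cost(allocated_cores)
print(f"Traditional: ${traditional:,}  ODA: ${oda:,}  Saving: ${traditional - oda:,}")
```

With these assumed numbers, charging for 8 allocated cores instead of all 24 cuts the license bill by two-thirds; the real saving depends on Oracle's actual price list and the customer's core allocation.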

As a result of these and other savings, Floyer estimates that a typical installed price for an ODA, including the ISV application running on it, is approximately $700,000, compared to $1.2 million for a traditional custom system running the same Oracle database and application. Additionally, the ODA is faster and less expensive to install, test and implement, which creates faster time-to-value.

As part of the research for this report, Wikibon interviewed three prominent Oracle resellers – Re-Quest Inc., Temenos Group AG, and Mformation, a subsidiary of Clearlake Capital Group. All three have adopted the ODA as the preferred solution for most customer needs. Re-Quest reports that customers were initially wary of Oracle owning both the software and hardware, but after seeing the advantages they now request the ODA approach.

The complete report, with a detailed discussion of the sources of the savings and other advantages of the ODA, is available without charge on the Wikibon Web site. IT professionals are invited to register for free membership in the Wikibon community. This allows them to influence the direction of Wikibon research and participate in that research, comment on published research, and publish their own questions, tips, Professional Alerts and longer written research.

Photo of Oracle headquarters courtesy Oracle Corp.
Smart World Weekly: Wearable tech for ALS, open platform for IoT (Fri, 22 Aug 2014 17:38:22 +0000)

This week’s Smart World Series features acquisitions, partnerships, and new releases to boost innovation in the Internet of Things market.

Image via Emberlight

For those who missed anything in this week’s Smart World Series, here’s a chance to catch up on the latest developments in the connected world.  Each week, SiliconANGLE rounds up the top news trends regarding smart homes and cars, smart data centers and IT, smart infrastructure and all things related to the Internet of Things.

Smart lights shine lucrative business on startups


Samsung Electronics Co. Ltd. is boosting its efforts in the smart home market with its acquisition of smart home solution startup SmartThings for an estimated $200 million. Also featured in this week’s roundup is a cheap solution for smart lights and a cool digital frame that shows works of art.

Find out more about this acquisition and cool smart home gadgets in this week’s Smart Living roundup.

Wearable tech grants independence to people with limited mobility


The ice bucket challenge has taken the world by storm, raising awareness of amyotrophic lateral sclerosis (ALS) and helping to fund research toward finding a cure.

We may be years away from curing ALS, but Royal Philips and Accenture plc have teamed up to create a trial application that could grant more independence to people suffering from debilitating diseases through the use of Internet-connected devices and smart home technology.

Find out more about how this app can help people with ALS live more independently in this week’s Smart Health roundup.

Enlighted gets $20M to intelligently light your office


Enlighted Inc. has raised $20 million in Series D funding from its current investors, namely Draper Nexus Ventures, Kleiner Perkins Caufield & Byers, RockPort Capital Partners, Draper Fisher Jurvetson and Intel Capital. As more buildings adopt smart lighting systems, some cities see them as a way to reduce greenhouse gas emissions.

Find out more about Enlighted’s new funding and how smart lights can reduce greenhouse emissions in this week’s Smart City roundup.

IoT tinkerers get new Linux hub & open platforms


Developers have new toys and environments to create IoT apps with the launch of Cloud Media’s Linux-based STACK Box, an open source smart home hub that integrates with most smart home solutions already available in the market. This week’s roundup also features a new partnership for an IP camera platform and an IoT developer site launching soon.

Find out more about what our IoT developer friends can look forward to in this week’s Smart DevOps roundup.

Tune in next week for more interesting stories, discoveries and innovations in the world of smart and connected things.

Microsoft unleashes its first NoSQL database into the cloud (Fri, 22 Aug 2014 16:50:18 +0000)

Microsoft has just shipped its first-ever non-relational database on Microsoft Azure, which also happens to be its first database product of any kind since the release of its legendary SQL Server.

Called “DocumentDB”, the database is a complete departure from Microsoft’s relational roots, being a schema-free, NoSQL offering built entirely for consumption as a service on the Azure cloud. The service was built following user requests for a fully managed database that could deliver query and transactional capabilities at scale, said Microsoft in its announcement.

DocumentDB is said to combine the database functionality of NoSQL with the transactional capabilities of relational databases. Vibhor Kapoor, Microsoft Azure’s product marketing manager, said it will run exclusively on the Azure cloud hosting service, providing basic document storage capabilities plus transaction semantics and query processing, which are two features commonly found in relational database systems. Kapoor says it’s ideally suited for organizations that require a simple back-end database for mobile or web application storage.

The database was in development for over a year, added Scott Guthrie, Executive Vice President of Cloud and Enterprise at Microsoft, in his own lengthy blog post on the matter. He said DocumentDB is already running instances that are “hundreds of terabytes” in size, processing millions of complex queries each day.

DocumentDB is a NoSQL database, which means it’s schema free and allows users to store JSON documents and query them using a document-oriented SQL query language.
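The article doesn't reproduce DocumentDB's query syntax, but the general idea of running SQL-style queries over schema-free JSON can be sketched in miniature (the toy `where` helper and all field names below are hypothetical illustrations, not DocumentDB's API):

```python
import json

# Schema-free "documents": each record can carry different fields.
docs = [json.loads(s) for s in (
    '{"id": 1, "type": "order", "city": "Seattle", "total": 42.0}',
    '{"id": 2, "type": "user", "name": "Ada"}',
    '{"id": 3, "type": "order", "city": "Redmond", "total": 7.5}',
)]

def where(docs, **conditions):
    """Toy stand-in for a document-oriented WHERE clause,
    e.g. SELECT * FROM docs WHERE type = 'order'."""
    return [d for d in docs if all(d.get(k) == v for k, v in conditions.items())]

orders = where(docs, type="order")
print([d["id"] for d in orders])  # → [1, 3]
```

The point of the sketch is that no table schema was declared up front: the "user" document simply carries different fields than the "order" documents, and queries filter on whatever fields are present.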

“DocumentDB has made a significant bet on ubiquitous formats like JSON, HTTP and REST – which makes it easy to start taking advantage of from any web or mobile applications,” said Guthrie, alluding to DocumentDB’s native support for JSON documents and wide range of programming libraries.

Guthrie also hyped up DocumentDB’s scalability, saying those instances now running are doing so with the “predictable performance of low, single digit” millisecond latency. Microsoft is hoping that developers will want to use DocumentDB in conjunction with the REST management API to manage things like subscriptions and billing.

If DocumentDB is even half as successful as Microsoft’s more famous SQL Server relational database, Redmond will be pleased. The company first shipped SQL Server 1.0 way back in 1989, but it only really caught on with the release of SQL Server 4.21 on its Windows NT server operating system in 1993. Since then, however, SQL Server has grown into a multi-million dollar business that drives much of the growth within Microsoft’s server and tools unit.

photo credit: CallieDel Boa via photopin cc
How Zynga uses Big Data to ensure new game success | #HPBigData2014 (Fri, 22 Aug 2014 16:17:11 +0000)

In gaming, analysis is critical and needs to take place in real time. You would think the pressure of real-time analytics would cause stress for something as complex as video gaming, but that’s not the case with Zynga. At Hewlett-Packard Co.’s Vertica Big Data Conference last week, Zynga’s Yuko Yamazaki, General Manager, and Joanne Ho, Engineering Manager, joined John Furrier on theCUBE to discuss how a standardized taxonomy and personalization models are the keys to its gaming analytics success.

For critical data, Zynga has a standardized taxonomy as a first tier. The company lets each of its games log its own game-specific tracking events into its HP Vertica database, recording all user activities for later analysis. When a metric logged by a product team proves successful, Zynga promotes it into the standard taxonomy – anything that, as Yamazaki put it, “makes sense to standardize.”

Yamazaki also mentioned that Zynga first has product managers and engineers decide what they want analyzed. This gets product managers and engineers thinking about what data the analytics team needs to log and analyze in order to move the business forward.

“We don’t just capture everything that happens and then decide. We let them decide first and then click on what makes sense,” said Yamazaki. This process creates a truly data-driven culture at Zynga, as everyone from game designers on up thinks about what data needs to be analyzed.
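Zynga's internal tooling isn't public, but the "decide first, then log" workflow described above can be sketched roughly; the taxonomy entries and event fields here are invented for illustration:

```python
# Hypothetical sketch of "decide first, then log": only events that product
# managers and engineers have agreed to standardize are accepted for analysis.
STANDARD_TAXONOMY = {"session_start", "level_complete", "purchase"}  # assumed names

event_log = []

def track(event_type, **payload):
    """Log an event only if it belongs to the standardized taxonomy."""
    if event_type not in STANDARD_TAXONOMY:
        return False  # not something the analytics team decided to capture
    event_log.append({"type": event_type, **payload})
    return True

track("level_complete", game="FrontierVille", level=3)
track("mouse_moved", x=10, y=20)  # dropped: never standardized
print(len(event_log))  # → 1
```

The gatekeeping step is the point: data lands in the warehouse only after someone has decided it carries business value, rather than capturing everything and deciding later.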

Personalization models enhance user experience


Ho explained that in the past, Zynga developed personalization models to enhance user experience. As one example, she cited the company’s model for increasing installs of new games. This model looks at current playing patterns, such as user engagement, and also examines the playing patterns of users’ friends to figure out how likely a user will be to install a game. This social graph helps Zynga introduce users to games they are likely to find interesting.

Zynga has successfully launched a large number of games within the past five years. The systems it has in place to collect and analyze data have helped the company identify patterns from one game to another, continue to improve user experience and repeat that success with new games.

See the entire segment below:

photo credit: slettvet via photopin cc
Top BYOx security tools used by the experts (Fri, 22 Aug 2014 15:22:15 +0000)

Wearable shipments are on track to grow tenfold over the next few years, reaching 150 million devices (worth $19 billion) by 2018, according to Juniper Research. And as mobile devices continue to infiltrate everyday life, the consumer impact on enterprise environments poses a threat to organizational security. The Bring Your Own Anything (BYOx) movement has introduced a bevy of smartwatches, fitness trackers, smartphones and tablets into the workplace, tapping into corporate networks and exchanging data between personal and business accounts.

In response, organizations must implement effective BYOx security solutions to protect their intellectual property as well as employee data. The challenges IT departments encounter with BYOx security include a solution’s scalability, remote control capabilities and effectiveness across multiple devices and operating systems. Today we hear from five industry experts on the best BYOx security solutions and how organizations can protect vital data from malicious individuals or entities.

5 experts share top services & tools for BYOx security


Jordan Edelson, Founder and CEO, Appetizer Mobile

BlackBerry has made an attempt to combat security risk with the launch of BlackBerry Balance for its BlackBerry 10 operating system, which splits the device into two profiles: one for personal usage and one for work usage. Businesses can also utilize effective security solutions such as Samsung KNOX Workspace, which provides IT administrators with remote control of user devices and applications.

Putting MDM software in place will allow IT to scale BYOx security. These types of systems allow companies to administrate devices remotely, track and monitor the location of devices, and wipe the devices if need be.

I have found AirWatch and Soluto to be most effective in addressing BYOx security concerns.


Adam Ely, Bluebox co-founder

Adam Ely, COO and Co-founder, Bluebox Security

Data Wrapping protects all corporate data, including email attachments, with document-level encryption and security policies that allow you to track, revoke or wipe data. AES 256-bit encryption will ensure that even if there is a data leak in a BYOx environment, the data itself is secure.


  • Instant App Protect secures corporate data in any internal or public app on-demand; no SDKs or coding required. Not only does this free developer and IT time, it allows employees freedom to self-serve secure application access. Make sure that protection includes context aware policies to control data leakage on the mobile device, between apps, and over the network to cloud storage locations.


  • Create self-defending apps. Select technologies that can convert existing apps into self-protecting apps against platform-level vulnerabilities, app tampering and jailbreaking/rooting so you no longer need to worry about whitelisting or blacklisting applications.


  • Secure corporate data end-to-end — from your internal or cloud storage applications down to the mobile device. It’s not enough to just protect from attacks on the mobile threat surface, you also want to ensure security end-to-end.


  • Separate corporate from personal data with flexible configurations.


  • Privacy dashboards. Respect your employees’ need for privacy by fully disclosing to them what IT is – and, just as importantly, isn’t – able to track.


BYOx is shifting the focus away from mobile device management and toward corporate data management. BYOx is completely changing the way companies look at security by focusing on the data rather than the device. Now, IT must find a way to monitor and secure data that runs through personally owned devices.

Emerging technology solutions that provide real-time visibility into mobile usage patterns can help facilitate the dialog between security/IT and lines of business. Once security/IT gains insights into how business is being transacted on mobile, they can tailor security policies and protocols to reduce the risk associated with those activities. These tools can also help a company stay current with emerging mobile use cases so they stay ahead of mobile threats to the business.


Blake Brannon, Senior Solutions Engineer, AirWatch by VMware

The introduction of mobile devices in the enterprise and the consumerization of IT changed the security model for IT. Organizations can’t throw everything behind a firewall now. The post-PC era really shows the importance of using multiple layers of security that depend on the end user’s role in the organization and the information they access and collaborate on. With the pace of mobile innovation, organizations can no longer “set it and forget it,” but must regularly reevaluate new threats and opportunities that come with mobile enhancements. The key is to make enrolling in a BYOD program as easy and beneficial for the end user as possible.

AirWatch Workspace enables IT to containerize corporate data on an employee-owned device that maintains security for corporate data while keeping an employee’s personal data, apps, email and content separate. With AirWatch, IT admins set up policies based on smart groups or user groups customized to their specific company, which makes growing from 100 to 100,000 devices automatic.

In a BYOD environment, selecting a solution that is platform agnostic (iOS, Android, Windows, BlackBerry, etc.) will increase employee adoption and IT control.


David Jevans, Marble Security founder

David Jevans, CEO and Chief Technology Officer, Marble Security

Importantly, the explosion of apps on BYOD devices can pose a huge risk to enterprises, from data loss, credential theft, APTs and spear phishing to apps mining corporate directories. These issues are not solved by the mobile operating systems, nor are they solved by Mobile Device Management (MDM) systems. Dynamic security is needed on mobile devices, just as it’s needed on PCs, servers and networks.

Recommended tools and services to solve BYOx security include:

  • Mobile Device Management (MobileIron, AirWatch, IBM MaaS360)
  • Mobile Application Management (see above)
  • Device Security (KNOX by Samsung for Android)
  • App Risk Management (Marble Security, Appthority)
  • Containers (Good Technology, KNOX by Samsung)


David Appelbaum, Senior VP of Marketing, Moka5

Since there is no real standard for BYOx implementation – it remains a mix of VDI, MDM, DaaS and other approaches – IT is forced to plug the holes its platform of choice has not addressed, and so may layer on multiple solutions, which of course runs counter to end-user and operational performance expectations in the first place. The bottom line is that BYO is still very much in its infancy, regardless of the hype.

We highly recommend Moka5 as the best solution for BYO security, management and end-user productivity. From our perspective, Moka5 is simply the most scalable and least intrusive method for managing and securing vast estates of BYO devices. Since we use a fully encrypted, locally executing secure container, we can manage 14,000 containers from a single, standard dual-processor Intel server. This level of scalability is impossible for VDI, which can typically support only 50-100 images per server, depending on size. We leverage the computing power of the host rather than the server, so our solution is vastly more scalable by design.


Cheryl Knight contributed to this article.

photo credit: CarbonNYC via photopin cc
Deadly Docker: Why containers are a threat to cloud virtualization (Fri, 22 Aug 2014 14:31:31 +0000)

Few technologies have more disruptive potential than Docker. Although the company and its namesake technology are barely two years old, almost every major cloud vendor has provided, or is in the process of providing, support or integration with Docker.

If you aren’t familiar with Docker, Brandon Butler at Network World has come up with one of the best definitions so far:

“Docker is both an open source project and the name of a startup that focuses on Linux Containers. Containers are the idea of running multiple applications on a single host. It’s similar to compute virtualization, but instead of virtualizing a server to create multiple operating systems, containers offer a more lightweight alternative by essentially virtualizing the operating system, allowing multiple workloads to run on a single host.”

Docker’s technology has generated huge excitement and clearly has massive potential, but does that mean everyone should swap out their virtual machines and replace them with containers?

Docker’s advantage


Docker’s containers can be hugely advantageous for anyone involved in cloud-based app development. Containers themselves might not be such an original idea, but Docker’s architecture holds great promise. Its lighter weight makes it a much better fit for widely distributed cloud-based platforms where there’s a need to move workloads to and from different resources.

As depicted in the following diagram, Docker does not require a hypervisor or an operating system residing with the app in a virtual machine. Docker’s architecture is a lot simpler than traditional virtualization: removing the need to collocate each application with an operating system inside a separate virtual machine is what makes apps easier to move from cloud platform to cloud platform. Virtualization, in contrast, is heavier, which makes it more difficult to deploy and more of a hassle to move apps between cloud platforms. Apps can therefore be built inside Docker containers and easily moved wherever you need them to be – from on-premise to cloud, or cloud to cloud.

[Diagram: Docker vs. virtualization. Image credit: Docker]
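The layering difference the diagram shows can be captured as a toy comparison (layer names only; a conceptual sketch, not a benchmark):

```python
# Conceptual sketch of the stacks described above: a VM carries a guest OS
# and runs atop a hypervisor, while a container shares the host kernel.
vm_stack = ["app", "bins/libs", "guest OS", "hypervisor", "host OS", "hardware"]
container_stack = ["app", "bins/libs", "container engine", "host OS", "hardware"]

extra_layers = set(vm_stack) - set(container_stack)
print(sorted(extra_layers))  # → ['guest OS', 'hypervisor']
```

The two layers a container drops – the guest operating system and the hypervisor – are precisely what makes the container lighter to build, ship and move between platforms.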


Disrupting DevOps


Docker also has certain advantages for developers. As Charles Babcock writes in InformationWeek, DevOps necessitates a high level of cooperation between operations managers and developers. Using Docker, developers can create their code without caring where it will run, and they can make changes to their code without needing to worry about maintenance. Operations managers can accept code that has already been tested and formatted, and that is guaranteed to be isolated from other bits of code in the production environment.

“With Docker, developers and operations, two groups that have perennially been at war can sit down at a table where a truce could break out and make it easier for both sides to get their jobs done,” explains Babcock.

The threat to VMware


One of the biggest question marks is whether or not VMware should be worried about the emergence of Docker. Docker’s light weight, simplicity, portability and appeal to developers means VMware cannot afford to just turn a blind eye.

“[Containers] can’t be matched in every way by sophisticated virtualization tools and management,” writes Babcock. “There’s evidence from IBM that containers deploy more quickly and run more efficiently than virtual machines. They can also be more densely packed on servers. That’s a big plus in the cloud, where overall efficiency remains a litmus test of who will thrive and who will die.”

That being said, virtualization does have a number of management advantages in enterprise data centers, thanks to the wealth of legacy apps running there – an advantage it should retain for the foreseeable future. Unfortunately for VMware, though, that advantage is not likely to last forever.

“The next generation of applications, many of which will run in the cloud, are more likely to be built with containers in mind rather than virtualization,” writes Babcock. “When applications are composed as assemblies of many moving and distributed parts, containers will be a better fit.”

Virtualized containers, anyone?


For its part, VMware has so far downplayed any fears that Docker might chomp away at its business.

Asked about the competitive threat Linux container technology could pose to the vendor’s virtualization business, VMware CEO Pat Gelsinger said virtual machines are a “proven” technology for handling security, networking and management, and can run both legacy apps and new apps. Containers, in contrast, don’t offer the isolation-based security benefits that are a core part of virtualization.

But VMware is hedging its bets by incorporating container technology into its own product mix. VMware CTO Kit Colbert recently hinted as much, saying that he believes containerization will live side-by-side with virtualization in the future.

“We (VMware) see containers and virtual machines as technologies that function better together,” wrote Colbert in a blog post this month. “By combining containers and virtual machines, customers can improve their ability to deliver applications without compromising their enterprise IT standards.”

The gist of Colbert’s argument is that container technology is still a bit risky and lacks the kind of management and robustness that VMware’s virtualization systems provide. That means containers might work better when run inside a virtual machine, giving system admins the best of both worlds, he said.

VMware has scheduled several VMworld sessions around the topic, with the theme “better together.” Whether people buy into the message remains to be seen, but it’s clear that the long-held promise of virtualization – more efficient use of server resources – is now being delivered at least as effectively by Docker.

photo credits: Free Grunge Textures – via photopin cc; i k o via photopin cc
USPS tracks 1 billion events per day: How the Internet changed mail | #HPBigData2014 (Fri, 22 Aug 2014 13:43:59 +0000)

The United States Postal Service has changed significantly since the advent of Internet e-commerce, and those technological changes have been at times both disruptive and profitable for the government agency. TheCUBE hosts John Furrier and Dave Vellante sat down with the postal service’s CIO Jim Cochrane to talk about some of those advancements in postal technology and some of the challenges such a large organization faces.

The hosts began by asking Cochrane what disruptive forces have arisen lately in the shipping industry. He mentioned that the crowdsourcing of delivery and carrier services is a big challenge. The USPS is also a largely paper-based organization, so digital technology in general is a disruption.

On the other hand, the explosion of e-commerce as a retail shopping tool of choice for many consumers has been a significant win for the postal service. While some predicted e-commerce would kill the postal service, the organization has instead increased revenue and transitioned from delivering more mail than packages to now delivering more packages than mail.

The USPS manages over 300,000 connected mobile devices that send tracking data to post offices and to customers in real time. This data, over 1 billion tracking events per day, ensures that the postal service is able to reach 151 million doors in a timely fashion and within the time frame customers have come to expect.
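Those headline figures translate into steady-state rates with straightforward arithmetic (this naively attributes every event to a mobile device, which overstates the per-device number):

```python
# Back-of-the-envelope rates from the figures quoted above.
events_per_day = 1_000_000_000   # tracking events per day
devices = 300_000                # connected mobile devices
seconds_per_day = 24 * 60 * 60

per_second = events_per_day / seconds_per_day
per_device_per_day = events_per_day / devices
print(f"{per_second:,.0f} events/second, "
      f"{per_device_per_day:,.0f} events/device/day")
```

Even averaged over a full day, that is more than 11,000 tracking events ingested every second, which is why real-time delivery of that data to post offices and customers is a genuine big data problem.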

Efficiency & understanding


Tracking is a customer requirement, he explained, but the data the organization gathers from that tracking can also help make it more efficient. The agency can now better understand how customers buy and choose products, which has created partnerships for the postal service with major retail companies.

When asked about the evolving role of CIOs, Cochrane pointed out that he did not come from an IT background but rather a marketing and operations one. A CIO in modern times needs that background knowledge along with the technological knowledge. “So, it’s not just making sure you have robust systems and they’re working. You need to know how you’re using those systems to create business value and customer value,” he said.

You can watch the full interview with Jim Cochrane, CIO and Executive Vice President of USPS, right here.

Twitter’s Big Data crunching ‘BotMaker’ muscles in on spam (Fri, 22 Aug 2014 11:50:15 +0000)

If you’re an avid Twitter user, you might have noticed a significant drop in the amount of spam messages and tweets bugging you. That’s because Twitter has introduced a new anti-spam system called BotMaker that has helped it achieve a 40 percent reduction in its key spam metrics.

Twitter’s Raghav Jeyaraman describes in a lengthy blog post why fighting Twitter spam is much more challenging than defending against traditional email spam. He also reveals how Twitter’s developers went about creating BotMaker, and provides a simplified look at its architecture.

Why spam loves Twitter


There’s a good reason why Twitter is so vulnerable to spam: its wide-ranging APIs, which are designed to let developers easily interact with the site, mean that spammers “know (almost) everything” there is to know about how it functions. As a result, it has proven very easy to create and distribute spam, and very difficult to deploy countermeasures against it.

Twitter’s real-time nature presents another problem, too, because it means any countermeasures that are deployed must not add to the latency of the user’s overall experience.

Keeping in mind these challenges, Twitter’s spam fighters needed to design a system that would do three things – prevent spam from being created, reduce the amount of time spam is visible, and reduce the reaction time to new spam attacks. At the same time, Twitter had to ensure that no one was able to tamper with or bypass its system, and that it didn’t lead to more latency.

BotMaker to the rescue!


Such a complex challenge requires an even more complex system, and BotMaker was devised in three parts. “Scarecrow” is a low-latency subsystem designed to check for spam in the write path of Twitter’s main processes (tweets, retweets, favorites, messages and so on) in real-time. Meanwhile, “Sniper” is described as a “computationally-intense and learning sub-system” that checks in “near real-time” the user and content event logs of Scarecrow.

Finally there’s BotMaker itself, which is constantly being fed data from Scarecrow and Sniper. Its job is to issue one of three commands to the write path (accept, challenge or deny), and also to the actioner (delete message, reset password, suspend), to cut out much of the spam. In addition to these efforts, Twitter runs periodic checks on all of the data BotMaker compiles to try and sniff out more spam and dodgy accounts.

[Diagram: BotMaker architecture. Image credit: Twitter blog]
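Twitter hasn't published BotMaker's code, but the division of labor described above – a cheap synchronous check in the write path, with heavier analysis running near real time over the logs – can be sketched roughly. Every rule and threshold below is invented for illustration:

```python
# Rough sketch of a two-stage spam pipeline in the spirit of Scarecrow/Sniper.
# All rules and thresholds here are invented, not Twitter's actual logic.
from collections import Counter

def scarecrow(event):
    """Low-latency check in the write path: cheap rules only."""
    text = event["text"].lower()
    if "free followers" in text:       # blatant spam pattern
        return "deny"
    if text.count("http") > 2:         # suspicious, but not certain
        return "challenge"             # e.g. require a CAPTCHA
    return "accept"

def sniper(event_log):
    """Near-real-time pass over accepted events: more expensive analysis,
    here a toy burst detector flagging users who post too fast."""
    counts = Counter(e["user"] for e in event_log)
    return {user for user, n in counts.items() if n > 3}

log = []
for event in [{"user": "bot1", "text": "free followers here http://x"},
              {"user": "alice", "text": "lunch was great"}]:
    verdict = scarecrow(event)
    if verdict == "accept":
        log.append(event)
    print(event["user"], verdict)  # → bot1 deny, then alice accept
```

The fast stage keeps the write path cheap by deciding with simple rules, while the slower stage is free to aggregate across many events, which mirrors the latency trade-off the article describes.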


The end result is an anti-spam system with a low-latency filter that’s capable of cleaning up spam with high-latency processes. It’s also capable of machine learning, which means it can adapt to get better as time goes by.

BotMaker’s rule language and data structures were built in a way that allows for rapid development, testing and deployment of system wide code changes. This allows BotMaker to quickly iterate and refine its rules and models in the evolving fight against spam.

“Spam evolves constantly,” wrote Jeyaraman. “Spammers respond to the system defenses and the cycle never stops. In order to be effective, we have to be able to collect data, and evaluate and deploy rules and models quickly.”

Jeyaraman explained that this was achieved by making the BotMaker language type safe, all functions pure and all data structures immutable, while ensuring the runtime supports common functional programming idioms.

photo credit: Tinkerbots via photopin cc
SDN? How about $$$DN? IDC predicts $8bn sales by 2018 (Fri, 22 Aug 2014 11:00:54 +0000)

The future of software-defined networking (SDN) is looking very rosy indeed, with the market segment set to rake in a healthy $8 billion by 2018, well up from the comparative pittance ($960 million) it’s set to bring through the door in 2014.

This optimistic forecast comes from the analysts over at International Data Corporation (IDC), and is streets ahead of previous estimates from Grand View Research, which reckons the global market for SDN will reach $4.9 billion by 2020. IDC says demand for SDN, which turns the high-end functions of networking gear like switches and routers into software, is being driven by organizations’ need for more flexible networks, especially as they deploy new solutions for cloud, mobility, big data and the Internet of Things.

While major cloud service providers have been the earliest adopters of SDN, the next two years will be a “significant launch point” for SDN technologies in the enterprise, said IDC Research Director for Datacenter Networks Brad Casemore.

“SDN is taking center stage among innovative approaches to some of the networking challenges brought about by the rise of the 3rd Platform, particularly virtualization and cloud computing,” adds Rohit Mehra, Vice President, Network Infrastructure at IDC. “With SDN’s growing traction in the datacenter for cloud deployments, enterprise IT is beginning to see the value in potentially extending SDN to the WAN and into the campus to meet the demand for more agile approaches to network architecture, provisioning, and operations.”

Any market that can rake in $8 billion is worth chasing, but with SDN it’s even more compelling when one considers how it compares with the wider networking market. Taking into account sales of Ethernet switches, routers, WAN, WLAN, enterprise video and telepresence gear, plus Fibre Channel and InfiniBand boxes, IDC says the networking hardware market will be worth $50.14 billion by 2018. That’s up from $42.5 billion in 2014, which means the wider segment will actually grow faster than SDN in absolute dollar terms, although far slower in terms of annual growth rates.
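For the curious, the comparison works out like this (a quick back-of-the-envelope check using the figures above; the compound annual growth rates are our own arithmetic, not IDC's):

```python
# Quick check of the growth comparison using the figures quoted above.
sdn_2014, sdn_2018 = 0.96, 8.0        # SDN revenue, $ billions
net_2014, net_2018 = 42.5, 50.14      # wider networking hardware, $ billions

# Absolute dollar growth over 2014-2018
print(round(net_2018 - net_2014, 2))  # 7.64 -- slightly more than SDN's gain
print(round(sdn_2018 - sdn_2014, 2))  # 7.04

# Compound annual growth rate over the four years
cagr = lambda start, end, years: (end / start) ** (1 / years) - 1
print(round(cagr(sdn_2014, sdn_2018, 4), 3))  # 0.699, i.e. roughly 70% a year
print(round(cagr(net_2014, net_2018, 4), 3))  # 0.042, i.e. roughly 4% a year
```

In other words, the two markets add a similar number of dollars, but SDN does it from a base more than 40 times smaller.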

In other words, SDN is definitely a technology to watch. What’s less certain is how networking gear will fare in the future – will it become less important as SDN catches on?

That’s hard to say, but the most optimistic outlook is that this is good news all round, with SDN representing a burgeoning new market and traditional networking gear proving it’s still got plenty of life left in it.

photo credit: wstera2 via photopin cc
LibraTax liberates Bitcoin users from the hassles of taxation Fri, 22 Aug 2014 11:00:06 +0000

Back in March, the Internal Revenue Service announced that Bitcoin and other altcoins would not be considered as currency for tax purposes, but treated as “property” instead.

The announcement didn’t sit well, especially with miners who must report the fair market value of the virtual currency as gross income on the date of receipt, not to mention how this taxation will affect users, cryptocurrency payments processors, retail shops that accept cryptocurrencies, and exchange markets.

Computing your tax obligations in regular currency is hard enough, but those using Bitcoin now face an even more daunting task. To keep Bitcoiners from going insane, somebody’s come up with a tool to calculate your cryptocurrency taxes for you.

LibraTax is the first tax preparation tool for Bitcoin and altcoins. It helps determine how much tax you should be paying when using Bitcoin to buy things, donate, pay for services, or even give gifts – in case people want to declare their Bitcoin spending.

LibraTax works by automating the accounting process related to virtual currency spending. It does so by retrieving the user’s transaction history from the public blockchain and syncing it with the digital currency’s historical fair market value. The service allows users to track virtually all taxable events without having to manually calculate each transaction.
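LibraTax hasn't published its internals, but the accounting it automates can be sketched simply: under property treatment, each disposal of bitcoin is a taxable event whose gain is the fair market value when spent minus the cost basis when acquired. The code below is a simplified illustration using hypothetical prices and first-in-first-out (FIFO) lot matching, not LibraTax's actual method:

```python
from dataclasses import dataclass

# Simplified illustration (not LibraTax's actual code): each disposal's
# capital gain is fair market value (FMV) at spend minus FMV at acquisition.
@dataclass
class Lot:
    amount_btc: float
    fmv_usd_per_btc: float  # price when the coins were acquired

def fifo_gain(lots, spent_btc, fmv_at_spend):
    """Compute the capital gain for a disposal, consuming lots first-in-first-out."""
    gain = 0.0
    for lot in lots:
        if spent_btc <= 0:
            break
        used = min(lot.amount_btc, spent_btc)
        gain += used * (fmv_at_spend - lot.fmv_usd_per_btc)
        lot.amount_btc -= used
        spent_btc -= used
    return gain

# Hypothetical history: bought 1 BTC at $300, later 1 BTC at $500,
# then spent 1.5 BTC when the price was $600.
lots = [Lot(1.0, 300.0), Lot(1.0, 500.0)]
print(fifo_gain(lots, 1.5, 600.0))  # 1.0*300 + 0.5*100 = 350.0
```

A real preparer would also have to handle holding periods (short- versus long-term rates) and alternative lot-matching rules, which is exactly the tedium the service promises to take off users' hands.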

“The recent guidance given by the IRS in March to treat digital currency as property requires that taxpayers report digital currency gains and losses on state and federal returns,” Libra said. “Taxpayers and tax professionals alike have been uncomfortable and encumbered with this manual calculation because it is extremely time-consuming and prohibitively difficult.”

One major appeal of Bitcoin is its anonymity – you don’t have to reveal your identity when buying things. Unfortunately, if you use LibraTax, you need to disclose your Bitcoin addresses so your transactions can be analyzed. However, Libra does offer the option to upload a spreadsheet of all your transactions with no public addresses or identifiable details.

LibraTax is free to use until its official launch in mid-September, when it plans to release a premium version that could cost between $10 and $19. It also plans to release a version of LibraTax that caters especially to tax professionals, CPAs, and accounting firms, on a subscription basis.

photo credit: fd via photopin cc