SiliconANGLE: Extracting the signal from the noise
Tue, 23 Sep 2014

BitPay announces groundbreaking Bitcoin payment partnership with PayPal

BitPay confirmed weeks of community rumors and speculation today with the announcement that the Bitcoin processing company is partnering with PayPal. BitPay described the first step of the partnership on its blog: integration with PayPal’s Payments Hub.

Merchants using the PayPal Payments Hub can now create a BitPay merchant account and receive API credentials for the Hub. This will allow customers with bitcoin wallets to pay with bitcoin for games, music, videos, news, ebooks, and other digital content via PayPal’s system.

“We believe Bitcoin offers unique opportunities as more people and businesses experiment with it,” said Scott Ellison, PayPal’s Senior Director of Strategy. “PayPal is excited to work with BitPay to offer new experiences and the trusted service our customers expect.”

BitPay isn’t the only company that PayPal is pulling into this Bitcoin partnership; the payments company is also joining up with Coinbase and GoCoin.

Ellison told Forbes the partnerships came about because “Merchants were asking for Bitcoin integrations.”

This move comes after months of speculation about PayPal’s slow shift from wariness of cryptocurrencies toward adopting Bitcoin outright. First it was David Marcus, then President of PayPal, saying “We’re kinda thinking about it” in reference to Bitcoin in 2013. Then in early September, eBay Inc.’s payment processing subsidiary Braintree (PayPal is also an eBay-held unit) announced soon-to-come bitcoin acceptance.

Analysts close to SiliconANGLE say PayPal’s partnership with three Bitcoin industry merchant processors points to a very sanguine outlook by the e-commerce giant. Three partners give PayPal the chance to see real-world examples of which platform will serve it best in the future.

As for the industry itself, PayPal’s partnerships mark a solid milestone on the road to mass adoption. For any sort of mainstream adoption to happen, Bitcoin needs not just startups and merchants, but also a market and ecosystem that contains mature financial companies like PayPal.

Bitcoin market value jumps nearly $50

On Friday the Bitcoin market index value tumbled by approximately $42 after the announcement of Alibaba Group Holding Limited’s IPO. Now that news of PayPal’s partnership with BitPay, Coinbase, and GoCoin has come down the pipe, the market value has begun to surge, regaining the entire loss and then some: it has shot up from just below $400 this morning to peak near $452.

As of writing, the value continues to rise.

IoT gets battery-free chips, smarter cars and tags
Tue, 23 Sep 2014

This week’s Smart Living roundup features a rechargeable item tracking device, a battery-free chip, and cars that adjust to your personal preferences.

A great day can easily turn sour when you can’t find the keys to your car or house, just as you’re headed out the door. The advent of the Internet of Things has delivered trackers for a range of items, and even kids and pets. But the problem with these trackers is that if you forget to replace their batteries, the tracker is rendered useless.

Tintag Electronics is taking trackers to the next level with the launch of Tintag, the market’s first rechargeable item tracker, on Indiegogo. Tintag has already raised close to its $50K funding goal, with 45 days still left in its campaign.

What makes Tintag unique among item trackers is that you don’t need to change its battery or ship it back to the manufacturer for a replacement; it comes with a Home Base wireless charger. Six hours of charging will power the device for about four months.

Tintag is waterproof, has LED and buzzer notifications for finding items, features a Bluetooth antenna with a 100 m range, can be used to find your phone, and comes with a mobile app for locating your Tintags. One Tintag can be connected to multiple phones, so all members of your family can keep tabs on the same tracker at the same time. Also, if an item, pet or kid ever gets lost, the Tintag community can help you find it: the finder simply inputs the unique Tintag ID, and the owner will immediately receive a notification. Since the Home Base wireless charger will only be used about three to four times per year, the people behind Tintag decided to make it more useful by adding hot spot functionality for monitoring all the Tintags.

Battery-free chips


The primary goal of the Internet of Things movement is to connect everything with every other thing through the web, but this will be a costly task to accomplish unless an affordable chip comes along. It looks like that chip has arrived.

Stanford assistant professor of electrical engineering Amin Arbabian leads a team that has created an ant-sized chip unlike any other: it is not powered by a battery of any kind.

The battery-free chip is an all-in-one design: its onboard antenna, about one-tenth the size of a typical WiFi antenna, can send and receive signals, and the chip can interpret and act on incoming instructions. For power, it harvests energy from the radio signals it receives.

Though this type of chip may have a limited range, Arbabian believes that is not a drawback, as the chips can be strategically placed throughout homes or buildings to form a network, much as neurons do.

Arbabian has worked with STMicroelectronics, Inc. to produce 100 of these battery-free chips as a proof of concept. A single chip costs only a few cents to make, which could significantly lower the cost of connecting more things to the Internet.

Jaguars will soon be able to recognize you


Jaguar Land Rover Automotive PLC is knee-deep in its Smart Assistant project, the carmaker’s answer to the promising market for smart, connected automobiles.

Car ownership in Europe has dropped significantly: 8 percent in London, 9 percent in Paris, and a more noticeable 16 percent in Munich. One reason behind this drop is that more consumers are choosing to spend their money on productivity-enhancing technology rather than a new car. To boost new car sales, Jaguar and Land Rover are making their automobiles smarter with the Smart Assistant, and the company plans to roll out the new feature in the next 24 months.

The Smart Assistant will use cameras to recognize the driver’s face and connect to the driver’s smartphone, adjusting settings such as climate to the person’s preferences. Like other connected dashboards, Smart Assistant aims to keep drivers focused on the road while still giving them hands-free mobile access.

JumpCloud unveils Directory-as-a-Service (DaaS) Active Directory replacement in the cloud
Tue, 23 Sep 2014

JumpCloud is an agent that frees IT departments and DevOps teams from tasks related to the management and monitoring of servers and, more generally, of the entire cloud infrastructure, letting startups and companies concentrate on development and applications.

Now extending into cloud infrastructure, the Boulder, CO-based company has announced JumpCloud Directory-as-a-Service (DaaS), a new offering the company positions as an alternative to existing on-premise directory services such as Microsoft Active Directory (AD) and the Lightweight Directory Access Protocol (LDAP).

A true replacement?


Conventional IT in the Windows world has historically dealt with the Active Directory service to provide centralized services for identification and authentication via directory management.

JumpCloud’s DaaS offering brings user directory management to the cloud. According to the company, the service fulfills three core needs: organizations that need a user directory; organizations replacing AD; and organizations extending their AD or LDAP to the cloud. Starting at $10 per user per month, and free for the first ten users, JumpCloud supports the LDAP protocol and works with Windows, Mac, and Linux devices.
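Under that pricing, the first ten users are free and each additional user is billed at $10 per month. A minimal sketch of the arithmetic (the exact tiering is an assumption based on the description above; JumpCloud’s actual billing may differ):

```python
def monthly_cost(users: int, free_tier: int = 10, per_user: float = 10.0) -> float:
    """Estimated monthly bill: the first `free_tier` users are free,
    and each user beyond that is billed `per_user` dollars per month."""
    return max(0, users - free_tier) * per_user

# A 25-person shop pays for the 15 users beyond the free tier.
print(monthly_cost(25))  # 150.0
```

So a small team stays free until it crosses the ten-user threshold, after which cost scales linearly with headcount.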

The service can authenticate, authorize, and manage users, devices, and applications. It acts as a single user store for an organization or extends existing AD and LDAP user stores to the cloud. The service also works with single sign-on (SSO) to manage multiple applications and directories.

“As we have seen with the transitions from Siebel to SalesForce, Exchange to Gmail, and file servers to DropBox, JumpCloud is enabling the transition of one of the last on-premise technologies to a fully managed service: the user directory,” says Rajat Bhargava, JumpCloud’s President and CEO. “Directory-as-a-Service brings IT teams the simplicity, ubiquity, and security of user and device management, all from the cloud.”

Good for developers


JumpCloud DaaS promises to help organizations be more agile and more effective by extending the on-premise directory service to the cloud or migrating locally-managed directory services to the cloud.

Developers no longer need to extend their code to meet AD requirements; the cloud-based directory service keeps up in terms of availability, scalability, and flexibility. Further, the company says DaaS enables IT admins to take advantage of major IT innovations, including cloud services such as IaaS and Gmail, and additional device/operating system combinations, such as Linux and OS X, among others.

Recently, JumpCloud extended the reach of its automation tools to more developers through the Marketplace for Rackspace Hosting. In addition, the company launched its cloud hosting-based server orchestration platform, which helps users automate the execution of server management tasks and workflows across groups of servers. The orchestration platform significantly reduces errors and time spent on manual server management, and increases uptime, consistency of operations, and availability by centralizing, and making visible, scripts created in an ad hoc fashion.

Puppet Labs gives the reins to DevOps with enhanced server and tools
Tue, 23 Sep 2014

This morning Puppet Labs unveiled new features for its DevOps automation software and a constellation of new company initiatives during PuppetConf 2014. The major updates to the Puppet open source and commercial platforms include an all-new Puppet Apps platform, new management and reporting capabilities, and a major update to the Puppet language.

Puppet Labs CEO Luke Kanies revealed that the new Puppet version 3.7 improves overall performance by up to 300 percent. The update, he says, increases efficiency, stability, and flexibility for IT teams looking to automate deployment.

Speaking to SiliconANGLE, Kanies said the platform update represents a next-generation Puppet server that takes into account the ever-increasing role of cloud and virtualization in the enterprise, as well as a notable increase in scale. He said it’s becoming more common for enterprise users to scale to 50,000 nodes, up from the 1,000 to 5,000 typical in previous years. This need for increased scale has led Puppet to work on matching stability and flexibility at scale.

Flexibility with Puppet Apps

The Puppet Apps feature set is designed to let developers and users innovate quickly around core features with purpose-built applications focused on solving IT automation challenges. While Puppet Apps work directly with the Puppet platform, they are released independently, which allows for frequent updates to keep up with changing industry needs.

Node management and reporting in the hands of DevOps

With Puppet Node Manager, DevOps teams get a more hands-on role in engineering and issuing automation for deployment. The update also adds new role-based access controls and increased granularity in Puppet server reporting, with a profiler and metrics service.

The Puppet Node Manager makes it simple to orchestrate large numbers of frequently changing systems by allowing admins to manage systems based on their role rather than simply their names. This means admins can set roles for different segments of infrastructure and then manage that infrastructure with rules that percolate through the system.

Security via role-based access control has also been added to Puppet Enterprise, using granular RBAC capabilities to make sure the right people are the only people who can make mission-critical changes, and it integrates directly with standard directory services including Microsoft Active Directory and OpenLDAP.
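Role-driven node management of this sort can be pictured as a rule engine that classifies machines by their attributes rather than by name. A minimal, hypothetical sketch (the rule format and node attributes here are illustrative only, not Puppet’s actual API):

```python
# Classify nodes into roles by matching attribute rules, then manage
# configuration per role -- rather than naming each node explicitly.
NODES = [
    {"name": "web-01", "datacenter": "us-east", "tier": "frontend"},
    {"name": "db-01", "datacenter": "us-east", "tier": "database"},
    {"name": "web-02", "datacenter": "eu-west", "tier": "frontend"},
]

# Each rule maps a role to the attribute values a node must have.
RULES = {
    "webserver": {"tier": "frontend"},
    "postgres": {"tier": "database", "datacenter": "us-east"},
}

def roles_for(node: dict) -> list:
    """Return every role whose rule matches all of the node's attributes."""
    return [
        role
        for role, required in RULES.items()
        if all(node.get(key) == value for key, value in required.items())
    ]

for node in NODES:
    print(node["name"], roles_for(node))
```

Adding a node with `tier: frontend` automatically picks up the webserver role; no per-node edits are needed, which is the point of rule-based classification.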

The role-based access control hooks directly into the Puppet Node Manager and will also be available to Puppet Apps and subsequent services.

Reporting is delivered via a system named Puppet Server Reporting, available with Puppet Enterprise, which adds a profiler and a metrics service to analyze and visualize everything that happens on the Puppet server. The back end includes a UI and control panel that breaks down every action taken on the Puppet back end and displays a wide variety of metrics, including active requests, request duration, execution times, and compilation load.

The metrics are available for export to numerous third-party reporting and alerting apps, such as those that support JMX and the popular Graphite server.

Updates to the Puppet language

One thing that makes Puppet distinct from other DevOps automation tools is its simple, domain-specific programming language for configuration and declarative infrastructure management. The language, also called Puppet, received several major enhancements in this newest update that greatly increase its functional and orchestration capabilities.

The enhancements improve the usability, completeness, and consistency of the language. Specifics include iterators, which make common data transformations possible and reduce manual repetition; a new data-type system that performs parameter checking; templates that can be written in Puppet instead of Ruby, making template writing easier; and enhanced error handling that allows easier root-cause investigation with more accurate reporting.


Infor CEO credits NYC creatives for Hook & Loop success | #Inforum14
Tue, 23 Sep 2014

With over 100 designers, Infor Global Solutions, Inc.’s internal creative agency Hook & Loop is one of the top three creative agencies in New York City. In an interview for theCUBE at Inforum 2014, Infor CEO Charles Phillips discussed why the company formed Hook & Loop and how basing its headquarters in New York City was key to the agency’s success.

Phillips believes that there are two reasons why organizations never really discuss the UI of enterprise applications: the people buying them don’t have to use them and the people using them don’t have a choice. As a self-proclaimed gadget guy, he admitted that he never liked the look of enterprise applications and wondered why they couldn’t look like the beautiful consumer applications created by companies like Apple Inc., Google Inc. and Samsung Group. Hook & Loop was created to solve this issue by developing enterprise applications that are as stunning as they are easy to use.

In forming Hook & Loop, Infor wanted people from a different world, who created designs for entertainment purposes. That kind of creative background brings a deep understanding of how to engage an audience, a characteristic traditional enterprise design isn’t known for. “They think of UI differently,” said Phillips. He admitted that Infor didn’t have that skill set, and the only way to get it was to find people who weren’t enterprise guys but who know and care about beauty and design. Phillips stated that New York City is full of people who fit this description, which was one of the reasons Infor based its headquarters there. “It’s hard to find these people. They’re not in some recruiter’s book,” he added.

Western Europe plays catch-up with Big Data
Tue, 23 Sep 2014

Western Europe is lagging behind the U.S. in terms of Big Data use due to a shortage of skills and worries over data security, according to a new report from International Data Corporation (IDC).

The rate of adoption has also been slowed by the area’s struggling economy and recent EU regulations like the “right to be forgotten”, which has called data usage and ownership into question.

European firms are particularly worried about this controversial new law, which “goes to the very heart of a company’s ability to mine even anonymised data” and could “negatively impact the value of collecting certain data if a company is not allowed to use it via big-data tools for business purposes,” the report said.

Despite these concerns, IDC has an optimistic forecast for western Europe’s Big Data technology and services market. Its report, verbosely titled Western Europe big data technology and services 2011–2013 market size and 2014–2018 forecast by country and segment, says the market will grow from $2.3 billion in 2013 to $2.9 billion by the end of this year, reaching $6.8 billion in 2018. IDC’s figures represent a compound annual growth rate of 24.6 percent between now and 2018.
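The growth-rate figure can be sanity-checked from the endpoints. A quick sketch of the arithmetic (the inputs are rounded billions, so the result only approximates IDC’s published 24.6 percent):

```python
def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate between two values over `years` years."""
    return (end_value / start_value) ** (1 / years) - 1

# $2.9B at the end of 2014 growing to $6.8B in 2018 spans four years.
rate = cagr(2.9, 6.8, 4)
print(f"{rate:.1%}")
```

The rounded endpoints land within about a percentage point of the reported rate, which is consistent with IDC working from more precise underlying figures.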

IDC splits Europe’s Big Data market into four segments: networking software, servers, storage and services. Of these, storage currently owns the lion’s share of the market, being valued at $536 million in 2013, followed by servers at $314 million.

Catching Up


“Western European organisations are catching up rapidly with their North American peers in terms of analytical maturity, despite later adoption,” said IDC in a statement.

IDC says it sees a correlation between better organizational performance and greater use of Big Data analytics. However, it notes the shift from analytics to Big Data creates its own problems that make it difficult for organizations to gain value.

“Value from big data is far from guaranteed,” admitted IDC’s research director Alys Woodward. To counter this, Woodward suggests vendors do more to help their customers build Big Data systems that are tailored to meet their specific needs.

As far as general adoption goes, IDC expects compound annual growth rates for individual countries in Western Europe to range between 22.3 percent and 32.2 percent. Factors driving adoption include the extent of existing analytics in use and macroeconomics.

“The UK, Benelux and Nordics tend to show higher initial adoption, though Germany and France are catching up rapidly, while southern Europe still lags behind,” IDC said.

As a final note, IDC said future Hadoop-related investments would likely focus on buying services, with the arrival of smaller, independent vendors set to boost adoption of Big Data technologies beyond large enterprises.

photo credit: NATS Press Office via photopin cc
Infor has a “good shot at overtaking” SAP and Oracle | #Inforum2014
Tue, 23 Sep 2014

Infor Inc.’s focus on micro-verticals, design, and research and development is “a transformation” that’s “taking the company to the next level,” according to Albert Pang, President of the market research company Apps Run the World Research, Inc.

Pang sees a bright future for Infor if it continues to stay focused on what it does best, especially since his company predicts the cloud applications market will more than double by 2018. Sharing his market insight with SiliconANGLE’s roving news desk, theCUBE, at the recently concluded Inforum conference held annually by Infor, Pang said he expects revenues from cloud-based enterprise applications to jump to 33 percent of the market from their current 10 percent.

If Infor can strike a balance between embracing new technologies and helping legacy companies move forward, Pang predicts it will reach an “unmatched level in the traditional ERP market,” overtaking rivals such as Oracle Corp. and SAP SE.

Watch Pang’s entire interview on theCUBE below.

Oncologist’s startup raises $3.7M to mobilize Big Data against cancer
Tue, 23 Sep 2014

As powerful as analytics have proven to be in the business and technology worlds, nowhere is data more impactful than in the healthcare industry, where providing the right people with the right information at the right time can quite literally make the difference between life and death. That reality has created a tremendous market opportunity for startups such as COTA Inc., which just nabbed $3.7 million in funding to help oncologists gain a better understanding of cancer cases.

In industry circles, the healthcare analytics space is often thought of primarily in the context of the groundbreaking experiments and futuristic research IBM Corp. is helping to power with Watson. The cognitive computing software behind the Jeopardy!-winning artificial intelligence (AI), which was recently released as a cloud service, is helping institutions such as the New York Genome Center tackle monumental challenges ranging from deciphering the mysteries of the human genome to curing brain cancer.

But while there’s no denying that those ambitious projects hold far-reaching implications for the future of medicine, what IBM is doing with Watson has not yet produced any meaningful change for the everyday patient in the here and now. At most oncology departments today, the primary factor that determines the level of service an individual receives is not the number of petabytes in the hospital’s database or what algorithms it uses to scan that data but the cost of delivering care.

Delivering cancer treatments at scale


And that’s the issue COTA is trying to address. The startup is the brainchild of Dr. Andrew Pecora, a world-renowned expert in cellular medicine and immunology who set out to remove the mundane but increasingly important operational challenges that make it so expensive to deliver cancer treatments at large scale.

COTA, whose name is an acronym for Cancer Outcomes Tracking and Analysis, offers a cloud-based analytics platform designed to provide oncologists with a clearer view of their patients and the reporting functionality needed to effectively communicate that knowledge. The service combines clinical and operational data into what the startup describes as a consolidated view of cases as they progress.

COTA provides a visual interface that makes it possible for a doctor to isolate a subset of patients based on factors such as type and stage of disease, examine how that group fares from a medical standpoint, and then correlate the results with business metrics like cost. The platform aims to pave a path for hospitals to transition from the traditional model of charging for services to a so-called value-based care regimen, which was unfeasible in the past due to a lack of visibility into operations.

The ability to accurately categorize individual cases can be a powerful tool for improving care delivery in and of itself, COTA highlighted. For example, an oncologist might check past outcomes for the specific patient group to which an individual belongs to determine whether chemotherapy is worthwhile.

COTA said that the new $3.7 million in funding from fellow healthcare analytics provider Med-Metrix LLC and New Jersey insurer Horizon Healthcare Services, Inc. will be spent on hiring biostatisticians and analysts to accelerate development of its namesake offering. The capital will also be used to expand sales and marketing as the service moves closer to general availability. The financing is part of a planned $7 million round that will come from Horizon.

Horizon marketing head Dr. Glenn D. Pomerantz, Med-Metrix CEO Joseph Davi and Dr. William T. DeRosa of the Regional Cancer Care Associates (RCCA) are joining the COTA board in conjunction with the investment.

Image via COTA, Inc.
BlackBerry’s Passport is a bargain-priced, square phablet for the enterprise
Tue, 23 Sep 2014

There’s fantastic news out of Waterloo, Ontario, for those of you feeling nostalgic for physical keyboards on mobile phones. BlackBerry Ltd. will begin selling the uniquely shaped Passport smartphone on Wednesday.

Originally announced during BlackBerry’s quarterly earnings call in June, the Passport was designed by the struggling device maker specifically to appeal to enterprise users. The 4.5-inch square screen is said to have an exceptionally high resolution of 1440 x 1440 pixels, for a pixel density of 453 ppi. This will allow far more characters to be seen on each line, which is beneficial when large amounts of information need to be seen at once, as is often the case in the medical industry.
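The density figure follows directly from the resolution and the screen size. A quick sketch of the arithmetic (assuming, as is conventional for phone screens, that the 4.5 inches refers to the diagonal):

```python
import math

def pixel_density(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch along the diagonal of a rectangular screen."""
    diagonal_px = math.hypot(width_px, height_px)  # pixels across the diagonal
    return diagonal_px / diagonal_in

# 1440 x 1440 pixels on a 4.5-inch square screen.
print(round(pixel_density(1440, 1440, 4.5)))  # 453
```

For a square panel the diagonal is simply 1440 times the square root of two, which is why the quoted 453 ppi checks out.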

Some users feel that physical keyboards allow for faster and more accurate typing. The Passport’s touch-sensitive keyboard will perform double duty, allowing users to swipe and make gestures as well.

BlackBerry Chief Executive John Chen revealed another part of his plan to regain lost market share: strategic pricing. In an interview with The Wall Street Journal, Chen suggested that the fair market price for Passport would be in the range of $700. In an effort to boost interest, the phone will retail for $599, without subsidies.

The Passport is the first major handset launch since last year’s BlackBerry 10 models. Despite dwindling handset sales, Chen has demonstrated that he’s determined to bring this area of the business to profitability. His goal is to offer enterprise customers a complete solution, which requires robust handset and services divisions. The staggering 96.4 percent combined market share of Android and iOS continues to grow, and it will be a difficult trend to reverse. BlackBerry’s focus on the enterprise could be the Passport’s saving grace. That, along with other unique advantages, such as BlackBerry 10 OS being the first and only mobile operating system to receive the U.S. Department of Defense’s Full Operational Capability designation, is reason enough to keep monitoring BlackBerry’s comeback efforts.

photo credit: SimonQ錫濛譙 via photopin cc
Database snapshots can cut backup costs, improve recovery points
Tue, 23 Sep 2014

Data backup costs can be decreased and recovery-time objectives (RTOs) improved by leveraging space-efficient database snapshots and a general-purpose catalog, writes Wikibon CEO and Co-founder David Vellante in his latest Professional Alert. Disk-based, purpose-built backup appliances using data deduplication, pioneered by Data Domain (now part of EMC Corp.), solved several problems but are still plagued by three issues in the Big Data era:

  1. Data growth is outpacing the appliance’s capacity and ability to accommodate new requirements;
  2. Data growth plus the demand for 24-hour, seven-day-a-week availability makes it difficult to complete backups within the available time windows;
  3. The appliances provide no visibility into the efficacy of the backups.

One result of this, a Wikibon survey of its members shows, is that many organizations cannot meet their desired RTOs for mission-critical applications.

Vellante argues that these problems can be reduced or eliminated by using space-efficient snapshots, which capture the changes in a database at a specific point in time. Once a base copy of the database has been backed up to the appliance, rather than backing up subsequent full copies each time, snapshots can be captured and, when a recovery is needed, applied to the base copy.

One advantage of this strategy is that snapshots are a fraction of the size of the full database, alleviating the first two issues. Also, snapshots can be made frequently, improving the effective recovery-point from 24 hours or longer to as little as 15 minutes.

The one danger Vellante warns of is “copy creep,” the tendency to take constant snapshots just because it can be done. It is therefore important to establish a practical recovery-point objective (RPO) for each database and use it to determine snapshot frequency. For instance, if the business can live with a 12-hour RPO, a snapshot might be taken every 12 hours, not every 10 minutes.
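The base-copy-plus-snapshots approach can be sketched in a few lines. In this hypothetical model (the data structures are purely illustrative, not any vendor’s format), each snapshot records only what changed since the previous capture, and recovery replays snapshots up to the desired recovery point:

```python
# A base backup plus a time-ordered list of space-efficient snapshots.
# Each snapshot holds only the values that changed since the last capture.
base_copy = {"orders": 100, "customers": 40}
snapshots = [
    (6, {"orders": 120}),                   # hour 6: orders table grew
    (12, {"orders": 150, "customers": 42}), # hour 12
    (18, {"orders": 180}),                  # hour 18
]

def restore(base: dict, snaps: list, recovery_point: int) -> dict:
    """Rebuild database state as of `recovery_point` by starting from the
    base copy and replaying every snapshot taken at or before that time."""
    state = dict(base)
    for taken_at, delta in snaps:
        if taken_at <= recovery_point:
            state.update(delta)
    return state

# Recover to hour 12: the hour-18 snapshot is simply not applied.
print(restore(base_copy, snapshots, 12))
```

Because each snapshot carries only a delta, storage grows with the rate of change rather than with the full database size, which is what relieves the capacity and backup-window pressure described above.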

The backup catalog


The backup catalog speaks to the lack of information on backup status from the purpose-built appliances. Backup catalogs capture information about which files have been backed up, where each backup is located, and other critical metadata. Typically this information applies only to a single vendor’s backup solution and is locked inside the appliance, where it can be used only by the system that stored the data.

A general-purpose catalog, Vellante says, can capture metadata on all copies of the database, including snapshots, and can be used to automate recovery from snapshots. It tracks when snapshots are made, if and when the snapshot’s differential data was copied and which snapshot is most current. When a recovery is needed, the catalog can adjudicate and automate which files should be restored, dramatically reducing the probability of administrator error.

To demonstrate the impact of this strategy, Wikibon has modeled its economics and potential savings in Capex and Opex versus the traditional backup approach using a backup appliance. Vellante discusses this model and illustrates it with two graphs in the report.

This strategy, Vellante argues, becomes increasingly beneficial as data volumes grow. Constantly adding purpose-built hardware and software to a hardened data backup appliance only exacerbates the problems while increasing the expense. IT practitioners should investigate and test new backup strategies using application-consistent snapshots and a general-purpose catalog to improve backup efficiencies, decrease costs, and create new business value.

The full Professional Alert, “Thinking Beyond Backup: Getting More from Snapshot Technology”, can be read without charge on the Wikibon website. IT professionals are invited to register for membership in the Wikibon community. This allows them to help determine the direction of Wikibon research, participate in that research, post questions and comments for the Wikibon analysts and post their own written research for the community.

photo credit: See-ming Lee 李思明 SML via photopin cc