Saturday, 24 February 2018

Cloud migration: The pros and cons of a common platform

Enterprise compute, storage, and database platforms are like rings of a tree. You add new ones each year, and they are mostly different from the last.

You use whatever platform is thought to be right at the time. That’s why you have mainframes, minicomputers, client/server systems, distributed systems, and open systems, with any number of processors, databases, and security systems. So you end up with 2,000 applications and about as many connected databases that run on all types of platforms.

And now, you are migrating to the cloud.

Should you consider moving many of these applications to a common platform, including processor, operating systems, and database? There are reasons to do that, and reasons not to do that.

Why to use a common platform in the cloud

The cost of management and operations is much higher when the platforms are heterogeneous. Consider that you need to maintain multiple skill sets and tool sets (for mainframes, open systems, Windows, Oracle, IBM, and so on) just to keep everything running correctly. Indeed, the more diverse the environment, the more money you’ll spend on ops.

So, if you can refactor as much as half of those applications to use common platforms, you’ll likely save 20 to 30 percent on ops and maintenance after the applications are migrated to the cloud. Although you typically can’t move everything to a common platform, the effort typically returns on the investment in three to five years, considering ops costs saved. 
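To make that math concrete, here is a back-of-the-envelope payback model in Python. The dollar figures are hypothetical placeholders; only the 20 to 30 percent savings range and the three-to-five-year payback window come from the discussion above.

```python
# Rough payback model for consolidating onto common platforms.
# The dollar amounts are assumptions for illustration only.
annual_ops_cost = 10_000_000   # current yearly ops/maintenance spend (assumed)
savings_rate = 0.25            # midpoint of the 20-30% savings range cited above
migration_cost = 8_000_000     # one-time refactoring/migration cost (assumed)

annual_savings = annual_ops_cost * savings_rate
payback_years = migration_cost / annual_savings

print(f"Annual savings: ${annual_savings:,.0f}")     # $2,500,000
print(f"Payback period: {payback_years:.1f} years")  # ~3.2 years
```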

Why not to use a common platform in the cloud

The downside is logical as well: the cost of moving applications to common platforms. The costs are all over the place and depend on how well the applications have been designed, and thus on how much hassle it is to move them from, say, Windows to Linux, or the other way. I’ve seen conversions happen in just a day or in 30 days, depending on how much time is needed to get the app moved, tested, integrated, and deployed.

There is no single answer

These pros and cons really mean that you need to look at the applications before making an assessment. The truth is that for many applications, moving to a common platform as part of the cloud migration may not be economically viable. For others it’s a no-brainer. You have to look to see which approach is best for each application.

https://www.infoworld.com

BYOG (bring your own glasses) will bring headaches for IT

The trickle of smart glasses entering enterprises on the faces of employees this year will become a flood in the years ahead.

As the use of smart glasses becomes more ubiquitous in the workplace, the challenges for IT departments and, indeed, for enterprises generally will grow.

And while the future of smart glasses is just now coming into focus, the unexpected consequences of this trend remain unclear.

What I believe is certain is that smart glasses are coming, they’re going mainstream, and very few organizations are ready for what’s coming. Let’s take a look.

Apple Glasses
In order to go mainstream, smart glasses have to look like ordinary eyewear, and that is exactly the kind of product Apple is working on.

Apple has many patents around smart glasses, but the most recently filed patent application addresses this issue. The idea is to create a lens within a lens — one lens for seeing the real world and the other for seeing the virtual world. The approach, called a catadioptric optical system, would enable Apple glasses to project very large virtual content from a very small projector.

The patent specifies the use of both outward-facing and inward-facing cameras in the glasses, which could be used for both gaze tracking and the mapping of the real world for the placement and anchoring of virtual objects.

Apple products always invite speculation by designers. Just this week, designer Taeyeon Kim posted some impressive (if overly fashionable) Apple Glasses concepts.

Samsung Galaxy Glass
Meanwhile, Samsung recently filed both a patent and a trademark that suggest a Samsung smart glasses product.

The company’s new patent in Korea describes “computerized vision-assisting eyewear,” which could be the long-rumored Samsung “Galaxy Glass” product.

Samsung recently filed a trademark for a logo that would represent “computer software for analyzing and configuring vision-assisting eyewear” and also “computerized vision-assisting eyewear consisting of a camera, computer and display to capture, process, and present images.”

That’s how a trademark lawyer might describe smart glasses.

Intel Vaunt
Intel is working on smart glasses it’s calling Vaunt glasses. They look close enough to regular glasses, but they do a few “smart things” that ordinary glasses don’t do.

Importantly, Vaunt glasses project visual information directly onto the back of the wearer’s right retina using low-powered laser technology. (It’s literally a “retina” display.)

Intel’s glasses don’t have cameras, speakers or microphones, although future shipping versions reportedly may contain microphones. They’re non-intrusive, visual and silent.

The wearer sees icons and red text, but only when looking down. Otherwise, the visuals are not visible.

Vaunt glasses should be able to provide walking directions, smartphone notifications and other "lite" forms of information.

They should get 18 hours of battery life, according to Intel.

The glasses can work with prescription or nonprescription lenses. They come in a variety of styles.

Intel plans to launch a beta program for the glasses this year.

VSP Level
VSP Global this week launched what it’s calling Level smart glasses. Think of Level glasses as fitness trackers. Instead of a watch or fitness band, however, they're glasses. Kind of like "Face FitBits."

The glasses contain a magnetometer, an accelerometer and a gyroscope, and they track movement, from which the included smartphone app can figure out distance, steps, calories burned and other quantified self data. They also contain a “Find My Glasses” feature.
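As a rough sketch of how a companion app can turn raw accelerometer samples into quantified-self numbers, here is a generic peak-counting step counter in Python. This is not VSP's actual algorithm; the threshold, stride length and calorie factor are made-up placeholders.

```python
import numpy as np

def count_steps(accel_xyz, threshold=11.0):
    """Naive step counter: count peaks in acceleration magnitude.

    accel_xyz: array of shape (n_samples, 3) in m/s^2.
    threshold: magnitude a sample must exceed to count as a step peak (assumed).
    """
    magnitude = np.linalg.norm(accel_xyz, axis=1)
    # A sample is a peak if it exceeds the threshold and both of its neighbors.
    peaks = (magnitude[1:-1] > threshold) & \
            (magnitude[1:-1] > magnitude[:-2]) & \
            (magnitude[1:-1] > magnitude[2:])
    return int(peaks.sum())

# Hypothetical usage: the 0.7 m stride and 0.04 kcal per step are illustrative only.
steps = count_steps(np.random.normal(9.8, 2.0, size=(500, 3)))
distance_m = steps * 0.7
calories = steps * 0.04
```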

I believe the most interesting aspect of VSP Level glasses is that they’ll be sold in the normal channel that serves optometry offices, starting in April in limited cities, including Sacramento, Seattle, Portland, Denver, Minneapolis and Washington, D.C.

This is the future of smart glasses. “Smarts” will be a checkbox item whenever you get prescription glasses.

What are smart glasses, anyway?
Pundits throw around terms that sometimes confuse more than illuminate.

For example, high-end devices such as Microsoft’s Hololens and Magic Leap’s One glasses are described as “augmented reality glasses” or “mixed reality glasses.” They’re also called “headsets” or “goggles.”

This category of device will for the foreseeable future be expensive, bulky and obtrusive. They will not be products for walking around, but instead will be used in controlled settings. They will specialize in high-definition, 3D virtual animated objects that appear to float in space.

Smart glasses are different, and when they go mainstream they will look almost like ordinary glasses or sunglasses, but will contain batteries and processors.

Smart glasses are for everyday, all-day wear, and they are far less specialized in their application. For example, smart glasses may only provide audio, or even blinking-light notifications. Or they could offer heads-up or low-res augmented reality. They might even simply gather data, or offer battery-powered, on-the-fly conversion to sunglasses.

In other words, smart glasses won't be a single thing. They could be tailored to individuals’ needs and do anything. And there will be a category of smart glasses for everyone.

Many users will wear minimalist glasses with microphones and bone conduction speakers for interacting with their virtual assistants.

Others will want glasses that function tomorrow like smartwatches do today: offering notifications and limited control over music, podcasts and phone calls.

Still others will do all that plus automatically or manually take pictures and videos, or perform rudimentary AR.

Even blind people will wear smart glasses. A Japanese researcher is crowdfunding "Oton Glass" glasses for blind, visually impaired, dyslexic and other users that translate text into voice.

In fact, it’s this extreme variety of function, combined with the necessity of eyewear, that will provide the greatest challenge.

Why BYOG will be such a challenge
We're facing the prospect of many or most employees carrying semi-concealed sensor bundles (sensors that include cameras and microphones) that connect either via Bluetooth, Wi-Fi or cellular networks and that track location.

It will be difficult or impossible to know which sensors and components are built into which glasses.

And in any event, the banishment of these sensor bundles will be extremely difficult, since they're also required for vision and therefore the basic performance of employees' jobs. You can ask meeting attendees or R&D visitors to leave their phones in a box outside, but you can't do that with glasses.

In addition to threats to trade secrets and heightened exposure to hacking, there will be new issues with illicit recordings and captured data between and among employees and by partners, customers and others.

Nobody has the answers to these challenges. But companies that want to stay ahead of the game need to start figuring out solutions sooner rather than later.

https://www.computerworld.com

Will Artificial Intelligence Spell the End for Customer Service Jobs?

An economic boom just dropped on the world -- and most, no doubt, aren't even aware. What happened? China's retail and technology conglomerate, Alibaba, developed an artificial intelligence model that beat the humans it competed against in a Stanford University reading and comprehension test.
This is historic.

"This is the first time that a machine has outperformed humans on such a test," Alibaba said in a statement.

But already, history's repeated.

Microsoft Research Asia, shortly after Alibaba's breakthrough, announced that its own AI had beaten the humans on the same reading and comprehension test, as well.

The news generated a small media earthquake through the fields of technology and AI. But it should've taken front and center on the economy and market watch pages as well. When all's said and done, the more AI advances, the less secure some workers' jobs become.

This is the oft-sad but true trade-off of technological achievement, and it's one that leads, as most roads seem to do, to a reckoning of dollar-and-cent realities.

So what's the deal-io here?

SQuAD -- the Stanford Question Answering Dataset, used to test reading comprehension -- is built on 500-plus Wikipedia articles, from which 100,000 questions and answers were created. An example: Which NFL team represented the AFC at Super Bowl 50?

Another example: What are the three sources of European Union law?

The human standard score on this test is 82.3. But Alibaba's and Microsoft's AI both beat that, scoring 82.4 and 82.6, respectively. AI programs from other organizations, including the Allen Institute for Artificial Intelligence, Tencent and Salesforce, came close to beating the human standard.
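For a sense of how those scores are computed, SQuAD-style evaluation compares a model's predicted answer against the human-written answers after normalizing both. Below is a simplified exact-match check in Python; it is loosely modeled on the normalization the official evaluator performs (lowercasing, dropping punctuation and articles), not the full scoring script.

```python
import re
import string

def normalize(text):
    """Simplified SQuAD-style normalization: lowercase, strip punctuation,
    drop the articles a/an/the, and collapse whitespace."""
    text = text.lower()
    text = "".join(ch for ch in text if ch not in string.punctuation)
    text = re.sub(r"\b(a|an|the)\b", " ", text)
    return " ".join(text.split())

def exact_match(prediction, gold_answers):
    """1.0 if the prediction matches any reference answer after normalization."""
    return float(any(normalize(prediction) == normalize(g) for g in gold_answers))

# The Super Bowl 50 question from the article: the AFC was represented by the Denver Broncos.
print(exact_match("The Denver Broncos", ["Denver Broncos"]))  # 1.0
```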

Why Care?

It's not perfected yet, but a fully achieved goal for AI on reading and comprehension holds massive promise to change the world of work -- which is to say, to dramatically reshape the economy of not just America, but of other nations, too.

"[T]he general techniques used in this test are very applicable to various commercial applications, including question answering or fact finding in response to people's searches or information requests (on web or mobile) and in customer service applications," said Christopher Manning, the Thomas M. Siebel professor in machine learning in the Department of Computer Science at Stanford University, in an email.

That means humans who work in jobs that require them to feed information to the public are at risk of AI replacement.

This is not hyperbole.

"[I]t is reasonable to assume that over the coming decade, a significant proportion of human customer service representations -- not all, but probably the majority who answer more routine questions -- will be replaced by digital assistants," Manning said.

In other words: The layoffs are coming -- and they're coming soon. Moreover, it's not just the people who answer phones.

Alibaba has already installed the technology on computers tied to Singles Day, a worldwide shopping event, as a means of fielding questions from buyers. The conglomerate is planning to gradually apply it to "museum tutorials and [for] online responses to medical inquiries from patients, decreasing the need for human input in an unprecedented way," the company said in a statement.

"Unprecedented" -- that's code for anywhere questions are asked.

Customers may like it, at least initially. After all, who wants to wait on hold for an hour for a live human when a machinery-made customer service rep can do just as well, and in jig time? Or an online chatbot?

But for those whose livelihoods depend on the customer service sector, the view of all this AI achievement is quite different.

The Bureau of Labor Statistics finds that the typical entry-level education requirement for most customer service representatives in 2016 was simply a high school diploma, or the equivalent. For that, workers in this sector earned a median of $32,300 a year, or $15.53 per hour. Not bad; in many places in America, that's enough to more than eke out a living.

It's a particularly decent living for those who have no desire, or no means, to go to college.

Companies may look at those numbers and see an AI way to cut payroll. And good for them. This is not a message against free-market capitalism. But the AI-related repercussions for the country at large? Consider this: An employed America is a happy America -- an America where self-sufficiency and independence are the standard expectation, and where government hand-outs are kept to a minimum.

Companies considering this new AI technology ought to simultaneously offer retraining for their customer service employees to obtain jobs in different fields. AI may be good for business, which translates into a booming economy. But then again, low unemployment is good for the national economy, as well. It'd be a shame for companies to see big gains from this chugging AI technology, only to see the country as a whole suffer from higher taxes to pay for the many unemployed.

https://www.newsfactor.com

Juniper's new products help prepare networks for hybrid, multi-cloud

Earlier this week, Juniper Networks announced a bevy of new networking products. (Note: Juniper Networks is a client of ZK Research.) In his blog post about the products, Andy Patrizio did an effective job covering the basics of the news. But he left out some important points that I want to call out, including Juniper's tagline "multi-cloud ready."

Hybrid, multi-cloud is inevitable 

As I’ve pointed out in many of my posts, hybrid multi-cloud environments are inevitable for most organizations. Small businesses may be able to build an IT strategy that is all public cloud, but any large company is going to choose a mix of private and public clouds.

The reality for those large companies, though, is that no matter how great their desire to achieve the utopian state of hybrid, multi-cloud, they just aren't ready to make that leap. These cycles can take a long time and will require some combination of infrastructure upgrades, organizational realignment, skills retraining and new technologies.

The updates to Juniper’s portfolio allow customers to upgrade their network today to keep up with bandwidth demands, and then gracefully transition to a more cloud-centric business when they are ready. Juniper CTO Bikash Koley talked about this challenge at length at the company’s NXTWORK user conference held at the end of 2017, and now the company is backing up that talk with some products.

Juniper's new products provide flexibility, options 

Juniper’s switches continue to evolve, but their underlying strength remains the Junos software, which creates flexibility and options for customers. For example, Juniper introduced some new data center products.

On the surface, the QFX10002 spine switch, which has 60x100 Gig-E ports, looks similar to the QFX5210 switch with 64x100 Gig-E ports. Other than four more ports, what’s the difference? The QFX10002 is built on Juniper silicon and is optimized for deep buffers, whereas the 5210 uses Broadcom silicon, is a bit cheaper and is meant for shallow buffers. I won’t get into the benefits of deep buffers. For those interested, Juniper’s Praful Lalchandani does a nice job highlighting the benefits in this post.

The different silicon families (Juniper also runs on Intel Atom) underscore the flexibility that Junos brings its customers. Customers can leverage the different products where needed but have a single set of scripts, automation tools, management platforms, etc. The value of the single platform also extends outside the data center, as it makes it easier to manage the network as a whole.

In the past, I’ve felt that the “single OS” story was a solution looking for a problem because customers rarely told me they cared if their branch equipment had the same OS as their data center. That was because the various places in the network were managed in silos. Today, that’s changed, and network professionals are looking at the network more holistically, so the common architecture and management capabilities make a difference.

Juniper's new data center products

There are a couple of other new data center products worth calling out. The first is the Broadcom-based QFX5200, which has native 25 Gig-E support. Juniper had 25 Gig-E products before, but they required breakout cables, which can be messy and are not ideal.

Another new data center product is the QFX MACsec line card that enables encrypted communications between locations. Companies looking to create dual data centers to power their private cloud will find this valuable.

Patrizio's post also mentioned the updates to the branch portfolio, so I won’t go into those in detail. Like the data center products, the branch ones are designed to address the needs of today but provide the flexibility to go cloud later. Customers can manage the products via the Contrail SD-WAN controller or from a new cloud management tool called Sky Enterprise. Cloud management is becoming increasingly popular, and what had been a hole in Juniper's product line has now been filled.

Juniper's new campus products

Juniper also announced several new campus products that were not included in Patrizio’s post. These products can also be managed via the Sky Enterprise cloud management tool. Sky also supports Aerohive WiFi access points, building on the strong partnership between the two companies.

Sky isn’t meant to replace Aerohive’s HiveManager. Instead, think of it as “HiveManager lite,” which provides visibility across the wired and wireless access edge. The new EX2300 and 4300 bring multi-gig capabilities to the Juniper campus product line.

Juniper is certainly late to the multi-gig party, but I suspect we won’t see a significant uptake of it until 802.11ax APs are out, so it should be able to catch up relatively quickly.

Juniper has been pushing the “operational simplicity” message for the better part of the past two years. Making the shift to the cloud requires a highly agile network, but also one that is easier to manage. The flexibility and automation capabilities that Junos brings enable customers to solve bandwidth and operational problems today, as well as set them up to move to a multi-cloud model when their company is ready.

https://www.networkworld.com

IoT: Sensor-as-a-service, run by blockchain, is coming

Telecommunications equipment maker Nokia has launched a turnkey, sensor-as-a-service offering for Internet of Things (IoT) networks.
The idea behind the product is to provide a way for mobile network operators (MNOs), many of which use Nokia cell site equipment, to monetize existing infrastructure, such as towers, by selling live environmental sensor data to cities and others.
MNOs increasingly are looking for new revenue sources as consumer smartphone growth plateaus. And cities need to adopt digital strategies to manage assets, increase efficiencies, and keep stakeholders happy. For example, they need granular real-time data about public transportation flow and air quality to ensure they comply with regulations—that the traffic is flowing and no illegal garbage is burning.
Nokia, which announced Sensing as a Service to coincide with Mobile World Congress taking place in Barcelona next week, says its product will achieve that.
"Cities need to become digital in order to efficiently deliver services to their habitants,” says Asad Rizvi, head of global services business development at Nokia, in a press release. “In addition, we can help [mobile network] operators generate new revenue.”
Blockchain used for billing/payment
Ingeniously, the offering includes a blockchain smart-contract element, which the Finnish company says will allow MNOs to bill smart city governments for processed and analyzed data. Smart contracts are mechanisms that automatically transfer funds, including micro-payments, between parties when certain conditions are met. They use blockchain decentralized ledgers.
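Nokia hasn't published its contract logic, but the basic smart-contract pattern, funds moving automatically once an agreed condition is met, can be sketched in a few lines of Python. Everything below (party balances, the per-reading rate, the class itself) is a hypothetical illustration, not Nokia's implementation; real smart contracts run as code on the blockchain itself.

```python
from dataclasses import dataclass

@dataclass
class SensorDataContract:
    """Toy contract: the city pays the MNO a micro-payment per verified sensor reading."""
    price_per_reading: float   # agreed micro-payment per reading (assumed)
    city_balance: float
    mno_balance: float = 0.0

    def record_delivery(self, readings_verified: int) -> bool:
        """Called when the ledger confirms a batch of readings was delivered."""
        owed = readings_verified * self.price_per_reading
        if owed <= self.city_balance:        # condition: funds are available
            self.city_balance -= owed        # payment transfers automatically,
            self.mno_balance += owed         # with no invoice or manual billing step
            return True
        return False

contract = SensorDataContract(price_per_reading=0.001, city_balance=100.0)
contract.record_delivery(25_000)   # the city is billed 25.0 automatically
```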
Enterprises need IoT skills
Sensor-as-a-service may gain momentum. That's because IoT strategy adoption can be a stumbling block for enterprises overall due to insufficient internal resources.
Vodafone, in a recent IoT Barometer study, says 10 percent of businesses don’t think they have the right skills. And 75 percent of IoT adopters have expanded their use of collaborators to get and run IoT-based solutions.
“They’re looking to partners and collaborations to fill their skill gaps,” says Vodafone, which also supplies telecommunications—in its case, networks.
IoT data for smart cities
In the case of Nokia’s sensor-as-a-service offering, which it alternately calls IoT for Smart Cities, the company says parking, trash management, environmental sensing, smarter lighting, and security (e.g., video surveillance) are primary commodities that MNOs can offer municipalities. They use existing base station sites, also known as towers.
Enterprises may also want access to that kind of data, Nokia says. That could include weather forecasting operations, healthcare and insurance. Smart cars and drones will also require better municipal data, the company says.
There’s minimal capital expense, and blockchain-based “anonymized, private and secure micro-transactions” pay the MNO from the city’s coffers. Nokia, not the MNO, performs all of the edge gateway and sensor installation. Data is stored on Microsoft Azure, Amazon Web Services (AWS), or a private cloud. The Nokia AVA platform or a choice of Amazon IoT or Microsoft IoT performs the data transfer.
“Currently, when most [mobile network] operators are considering their data, they only think about telecommunications data,” Nokia says in a brochure. “Only by branching out and collecting additional data sources can you truly deliver a compelling data service or product that generates revenue.”
In other words, tracking consumers as they wander a contracting retail environment and selling that data to marketers, called telco-data-as-a-service (TDaaS), is conceivably just the beginning of data monetization possible for telcos.
https://www.networkworld.com

5 ways blockchain is the new business collaboration tool

While blockchain may have cut its teeth on the cryptocurrency Bitcoin, the distributed electronic ledger technology is quickly making inroads across a variety of industries.

That's mainly because of its innate security and its potential for improving systems operations, all while reducing costs and creating new revenue streams.

David Schatsky, a managing director at consultancy Deloitte LLP, believes blockchain's diversity speaks to its versatility in addressing business needs, but "the impact that blockchain will have on businesses in various industries is not yet fully understood."

In 2017, blockchain technology started to become a key business focus for many industries, something a Deloitte survey conducted in 2016 had predicted.

That online survey of 308 blockchain-knowledgeable senior executives at organizations with $500 million or more in annual revenue found many placed it among their company's highest priorities. Thirty-six percent believed blockchain could improve systems operations, either by reducing costs or increasing speed, and 37% cited blockchain's superior security features as the main advantage. Another 24% saw its potential to enable new business models and revenue streams.

Although 39% of senior executives at large U.S. companies had little or no knowledge about blockchain technology, the rest said their knowledge ranged from "broad" to "expert," and 55% of that group said their company would be at a competitive disadvantage if it failed to adopt the technology.

The bottom line: both the understanding of and commitment to blockchain varies by industry. But most see it as disruptive.

"It is fair to say that industry is still confused to a degree about the potential for blockchain," David Schatsky, managing director of Deloitte LLP,  said in a statement last summer. "More than a quarter of surveyed knowledgeable execs say their companies view blockchain as a critical, top-five priority. But about a third consider the technology overhyped."

Those already embracing blockchain are finding a new independence in their ability to transmit both sensitive data and money securely, enabling a new business dynamic.

Blockchain is a decentralized electronic, encrypted ledger or database platform -- in other words, a way to immutably store digital data so that it can be securely shared across networks and users. As a peer-to-peer network, combined with a distributed time-stamping server, a blockchain database can be managed autonomously. There's no need for a central administrator; in effect, the users are the administrators.
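A toy sketch of that core structure helps make it concrete: each block's hash covers both its contents and the previous block's hash, so an earlier entry can't be altered without breaking every block after it. This is an illustration of the idea, not a production ledger (there is no consensus protocol or network layer here).

```python
import hashlib
import json
import time

def make_block(data, prev_hash):
    """Create a block whose hash covers its data, timestamp and the previous hash."""
    block = {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()
    ).hexdigest()
    return block

chain = [make_block("genesis", prev_hash="0" * 64)]
chain.append(make_block({"from": "alice", "to": "bob", "amount": 5}, chain[-1]["hash"]))

# Every peer keeps a full copy; tampering with chain[0] would invalidate
# chain[1]'s prev_hash link, which is what makes the record effectively immutable.
```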

Blockchain eliminates huge amounts of recordkeeping, which can get confusing when there are multiple parties involved in a transaction, according to Saurabh Gupta, vice president of strategy at IT services company Genpact. "Blockchain and distributed ledgers may eventually be the method for integrating the entire commercial world's record keeping," he said.

Smart contracts
Blockchain distributed ledgers can be used to automatically execute business contracts. The peer-to-peer database first captures all terms and conditions between an organization and its customers, then uses data gleaned across distributed nodes or servers to determine when those conditions have been met and payment is authorized.
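A minimal sketch of that flow, terms captured up front and payment authorized only once enough independent nodes report the condition as met, might look like the following. The node names, quorum size and condition are illustrative assumptions rather than any vendor's design.

```python
def payment_authorized(node_reports, required_quorum=3):
    """Release payment only when enough independent nodes confirm the
    contractual condition (for example, goods delivered) has been met."""
    confirmations = sum(1 for report in node_reports if report["condition_met"])
    return confirmations >= required_quorum

reports = [
    {"node": "node-a", "condition_met": True},
    {"node": "node-b", "condition_met": True},
    {"node": "node-c", "condition_met": True},
    {"node": "node-d", "condition_met": False},
]

if payment_authorized(reports):
    print("Terms satisfied across the network: payment is released.")
```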

For example, IBM, AIG and Standard Chartered Bank just announced a pilot project to streamline one of the most complex types of policies in the insurance industry - a multinational policy.

The three companies converted a master policy written in the UK, along with three local insurance policies in the U.S., Singapore and Kenya, into a “smart contract” based on blockchain technology that provides a shared view of policy data and documentation in real time.

The solution is structured so that multiple parties in the network -- including brokers, regulators and auditors -- can collaborate more effectively and efficiently, according to IBM. The solution gives all of the parties a unified view of policy and payment data and documentation so that they can make informed business decisions based on a common set of trusted data.

"It is designed to infuse trust and transparency into the insurance underwriting process across borders," an IBM spokesperson wrote in an email to Computerworld. "When critical data about the policy is stored on the blockchain, all permissioned parties in the network have a single view of the data, and no single party can make changes without the consensus of other members."

Because it's based on blockchain:

It provides the ability to record and track events and associated payments in each country related to the insurance policy.
No one party can modify, delete or even append any record without the consensus from others on the network.
This level of transparency helps reduce fraud and errors, as well as the need for the parties to contact each other to view policy and payment data and the status of policies. 
Blockchain-based smart contracts can be used to automatically execute payments between financial institutions.

Accenture recently released a report that claimed blockchain technology could reduce infrastructure costs for eight of the world's 10 largest investment banks by an average of 30%, "translating to $8 billion to $12 billion in annual cost savings for those banks."

Payments, clearance and settlement in the financial services industry -- including stock markets -- is rife with inefficiencies because each organization in the process maintains its own data and must communicate with the others through electronic messaging about where it is in the process. Because of that, settlement typically takes two days. In turn, delays in settlements force banks to set aside money that could otherwise be invested.

With its ability to instantly share data with each organization involved in a blockchain database or ledger, the technology reduces or eliminates the need for reconciliation, confirmation and trade break analysis as key parts of a more efficient and effective clearance and settlement process, according to Accenture.

Enabling businesses to avoid transaction fees
Most payment systems are administered by financial institutions, such as banks. When money is transferred between businesses, there's typically a fee associated with it -- especially for small to mid-sized businesses.

Large enterprises have always enjoyed an advantage in the global market, whether it's the capital to absorb the cost of transfer fees (or to negotiate lower ones), better intellectual property protection, or a host of other advantages that come with having more capital and greater influence.

Blockchain technology helps level the playing field, enabling SMBs to compete in that global market.

For example, the B2B payment service Veem leverages blockchain to allow its SMB customers to transfer funds internally for no fee; that compares to larger banks that charge around $50 per wire transaction.

Veem's CEO Marwan Forzley believes blockchain is an opportunity to "remove the middle man from international transactions, which directly impacts the experience of paying suppliers and contractors, the timing of these transactions and the fees that are directly impacting the SMBs bottom line."

Sharing patient data, ensuring doctors get paid
While electronic healthcare records (EHRs) have helped in the centralization of patient data to some extent, sharing that sensitive information with various healthcare providers, such as medical specialists, can be difficult at best because EHR platforms are not standardized across organizations.

Healthcare organizations could use the cryptographically secure, decentralized blockchain ledger to pre-authorize the sharing of a patient's information.

Last year, the MIT Media Lab and Beth Israel Deaconess Medical Center tested a proof-of-concept that shared information about patient medications through a blockchain ledger called MedRec. MedRec was based on the Ethereum blockchain platform for smart contracts.

In their analysis paper, titled "A Case Study for Blockchain in Healthcare," the MIT and Beth Israel Deaconess Medical Center researchers found blockchain "could contribute to secure, interoperable EHR systems."

In addition, healthcare IT vendors and the U.S. government are exploring blockchain's potential. Earlier this year, IBM's Watson Health artificial intelligence unit  signed a two-year joint-development agreement with the U.S. Food and Drug Administration (FDA) to explore using blockchain technology to securely share patient data for medical research and other purposes.

IBM Watson Health and the FDA plan to explore the exchange of patient-level data from several sources, including electronic medical records (EMRs), clinical trials, genomic data and health data from mobile devices, wearables and the "Internet of Things." The initial focus will be on oncology-related information.

Healthcare is also hampered by an inefficient payment system, where insurance companies fight with providers.

"Insurance companies have already given prior approval based on medical necessity or preauthorization and I've got to fight collect it..., really?" said Gene Thomas, CIO of Memorial Hospital, a 445-bed facility in Gulfport, Miss. "Depending on who you talk to, 17 cents, 21 cents... of every healthcare dollar is spent on collections. Are you kidding me?"

By underpinning a shared ledger where all parties involved in a healthcare insurance contract -- patient, provider and payer -- see the same information at the same time, blockchain has the potential to smooth out that "arduous, high cost, high friction" process.

"Everyone's posting to the same thing, it's all transparent. I've got high hopes that if there's any place blockhchain could actually have an impact in healthcare, it's on [the] revenue cycle side," Thomas said.

In light of that need, Deloitte's report found that healthcare and life sciences have the most aggressive deployment plans for blockchain of any industry, with 35% of survey respondents indicating their organization plans to deploy blockchain within the next year.

Timestamping documents
Timestamping is the process of wrapping metadata or other information in a block of an ongoing blockchain, which creates an unchangeable or immutable record tied to everything that comes after it in the chain. Think of it as satisfying a public notary's function.

For example, Swiss company Gmelius has created a Gmail Stamping app that's essentially a secure way to verify the integrity of an email using the Ethereum blockchain open-source platform. (Gmelius detailed the blockchain architecture it's using in a PDF.)

Gmelius' app is a Chrome browser extension that is available in the Google webstore. Once downloaded, it offers several capabilities for Gmail users, including removing Google ads, scheduling emails to send at later times and dates, and blocking senders from seeing whether you've read their email.

But it's Gmelius' email stamping feature that takes advantage of blockchain to authenticate the origin of an electronic message.

Upon hitting send in Gmail, the original email – including its headers, subject line, body and attachments – is encoded using the base64 binary-to-text encoding scheme and converted into a hash using the SHA2-512 hashing algorithm. The email is then signed using Gmelius's proprietary 512-bit RSA key. Encrypting the hash allows the email to become part of an anonymous transaction anchored onto an electronic ledger via the Ethereum open-source blockchain platform.
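The encode-and-hash stage of that pipeline can be reproduced with Python's standard library. This is a sketch of the process as described above, not Gmelius' code; the RSA signing and Ethereum anchoring steps are only indicated in comments because they depend on the service's own key and contract.

```python
import base64
import hashlib

def stamp_email(headers: str, subject: str, body: str, attachments: list) -> str:
    """Base64-encode the message parts, then take a SHA-512 digest of the result."""
    raw = headers.encode() + subject.encode() + body.encode() + b"".join(attachments)
    encoded = base64.b64encode(raw)
    digest = hashlib.sha512(encoded).hexdigest()
    # Next steps (not shown): sign the digest with the service's RSA key and
    # anchor the signed hash in a transaction on the Ethereum blockchain.
    return digest

fingerprint = stamp_email("From: a@example.com", "Contract", "Please sign.", [])
```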

The blockchain-based application proves the existence, integrity and ownership of any email sent, which would enable it to be used for business purposes such as legal contracts.

This spring, Gmelius also plans to release a blockchain-based app that will allow users to electronically sign documents attached to emails using blockchain's smart contract feature.

Selling energy through microgrids
Residents of the Park Slope area of Brooklyn are now able to sell power generated from rooftop solar panels via a microgrid, thanks to a blockchain ledger that records every transaction made with a local utility.

The physical microgrid, set up by Siemens Digital Grid Division, includes network control systems, converters, lithium-ion battery storage and smart electric meters. In case of another hurricane like Sandy in 2012, residents on the microgrid would continue to have power for a time -- even during a blackout -- as they could switch  to battery reserves.

A microgrid is a form of distributed energy generation that can function independently from the traditional, centralized regional power grid; it can enable towns, small cities or corporations to develop their own energy sources and power storage systems (via lithium-ion or flow batteries), distribute that energy and even sell excess power back to local utilities.

The Brooklyn Microgrid blockchain database is a web-based bookkeeping system that uses cryptographic technology to save energy data in a way that is both inexpensive and forgery-proof, the companies said.

The Brooklyn Microgrid enables residents to sell energy back to the local utility -- a process known as "net metering" -- and it allows those without solar panels to purchase green power credits from their neighbors. The blockchain platform for the microgrid is enabled by Brooklyn-based energy startup LO3 Energy.

The same blockchain technology that allows residential solar power users to sell excess power back to utilities can do the same for businesses seeking to lower their electricity costs.

https://www.computerworld.com

Hospital collaborative makes advances in AI-based healthcare

There should be no question that the world sits on the precipice of a major sea change in which artificial intelligence (AI) and machine learning (ML) will be as pervasive as air and infused into our lives, much as the internet has been. AI can analyze information, find anomalies and make informed decisions faster than people can, and it will be an important tool to help us do our jobs better.

However, the adoption of AI varies widely by vertical. One industry that has embraced AI is healthcare, where it can have lifesaving consequences.

A great example of this is the work that the Boston-based MGH & BWH Center for Clinical Data Science — a collaboration by Mass General Hospital (MGH) and Brigham and Women's Hospital (BWH) to create, promote and commercialize AI for healthcare — has been doing to accelerate the process of analyzing magnetic resonance images (MRIs).

I recently talked with Neil Tenenholtz, senior machine learning scientist at the MGH & BWH Center for Clinical Data Science (CCDS), to better understand what the organization is doing and how it is leveraging the NVIDIA DGX Station to power its initiatives. (Note: NVIDIA is a client of ZK Research.)

DGX Station brings data center power to the desktop

DGX Station can be thought of as a desktop-size, GPU-enabled supercomputer. The primary AI platform from NVIDIA had been the DGX-1, a rack-mountable server that holds eight NVIDIA Tesla V100 GPUs. The challenge with a data center-housed server is that a data scientist would need to work with IT to have the DGX-1 provisioned for a particular task. This usually creates a lag time that can get in the way of the researcher being able to analyze data and conduct deep learning experiments.

DGX Station is a portable, workstation-form-factor system powered by four Tesla V100 GPUs. Although the form factor is compact, the compute power of the four-GPU system is roughly equivalent to that of 400 x86 CPUs, making DGX Station ideal for computationally intensive workloads such as AI, machine learning and data analytics.

AI can improve radiologist efficiency

The mission of the group Tenenholtz works for is to develop AI-based models specifically for radiology and integrate them into radiologists' clinical workflows. Successful models can be licensed to external partners or spun out and commercialized through startups, so they may be used by hospitals all over the world.

At a high level, the center is trying to prove that AI-based models can improve radiologist and physician efficiency by completing analyses faster than they can be done manually. It’s important to understand that AI isn’t being used to replace the clinicians, but rather to serve as another tool for the next-generation radiologist.

GANs accelerate AI training

The current project the DGX Station is being used for is a brain “GAN” for MRIs. A GAN is a “generative adversarial network,” which is a class of AI algorithms used in unsupervised machine learning that consists of two components: The generator creates synthetic images, and the discriminator evaluates them to determine if they are real or not. Both components can potentially be highly valuable.

In the CCDS use case, the generator creates synthetic brain MRIs, and the discriminator evaluates them. Tenenholtz explained that the goal of the GAN is to train the two systems in parallel. Over time, the generator will become more accurate in creating synthetic brain images that can be used to augment the training set of tumor segmentation algorithms.
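In code, training the two networks in parallel amounts to alternating updates: the discriminator learns to tell real images from synthetic ones, and the generator learns to fool it. Below is a minimal PyTorch sketch of one training step; the tiny fully connected architectures and 64x64 image size are placeholders, not the CCDS MRI models.

```python
import torch
import torch.nn as nn

# Toy stand-ins for the real networks; the CCDS models operate on 3D MRI volumes.
generator = nn.Sequential(nn.Linear(100, 256), nn.ReLU(), nn.Linear(256, 64 * 64), nn.Tanh())
discriminator = nn.Sequential(nn.Linear(64 * 64, 256), nn.LeakyReLU(0.2), nn.Linear(256, 1))

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

def train_step(real_images):                 # real_images: (batch, 64*64) tensor
    batch = real_images.size(0)
    fake_images = generator(torch.randn(batch, 100))

    # Discriminator step: score real images as 1 and synthetic images as 0.
    d_loss = bce(discriminator(real_images), torch.ones(batch, 1)) + \
             bce(discriminator(fake_images.detach()), torch.zeros(batch, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step: produce images the discriminator scores as real.
    g_loss = bce(discriminator(fake_images), torch.ones(batch, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return d_loss.item(), g_loss.item()
```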

On the other hand, the discriminator, which learns to detect brain-like features, can be leveraged in future applications of “transfer learning,” a process where an algorithm trained on one task can be applied to another with fewer training examples than is normally required to train the algorithm de novo. Therefore, the discriminator could potentially be used to detect other, more rare brain abnormalities.

Without the availability of a pre-trained model, tens of thousands of images would need to be analyzed. With it, the number can be reduced significantly.

Turnkey nature of DGX Station leads to “out of the box” AI research

I asked Tenenholtz about the benefits of using DGX Station versus the DGX-1 or another computer with GPUs in it, and he explained the rationale for using DGX Station.

MGH currently has four DGX-1 servers in its data center, but one of the challenges with a hospital-wide resource is getting time on the system. He admitted that although the work they are doing is important, it wouldn’t have priority over a number of hospital systems. That could create delays in moving the project forward, so it became clear they wanted a dedicated resource.

They could have used a small server with four NVIDIA GPUs in it, but the communication between the processors is handled over the PCI Express bus, which can get overwhelmed by the volume of data being passed over it. DGX Station has a proprietary connectivity interface called NVIDIA NVLink that decouples the GPUs from the PCI Express slot to make data transfer faster.

One of the challenges with GPU computing is feeding the GPUs enough data to process and keep them utilized. NVLink won’t get saturated and bog down like PCI will. 

The other benefit is that everything is preconfigured and optimized, and the software and applications are tuned to the hardware. There are no drivers to install or versions to update.

“With the field of machine learning advancing at such a rapid pace, developer productivity is of the utmost importance," Tenenholtz said. "Therefore, any solution that allows the scientist to go from unboxing to distributed model training in minutes is highly desirable.”

As an analyst, I have seen technology waves come and go. Many years ago, the IT industry was dominated by hardware platforms. Over the past decade, “software innovation” has become the new black. I believe we are reaching a point where specialized hardware with optimized software is what’s required to move us into the AI age.

Web-scale companies like Google and Amazon have teams of people to take software and run it on commodity hardware to ensure it's performing optimally. Organizations such as MGH do not have the resources to take this approach and need tools like DGX Station that just work, so they can get to work.

https://www.cio.com