Tuesday, 27 February 2018

5 Reasons Data Scientists Should Adopt DevOps Practices

As the pace of business continues to accelerate, software and data science teams find themselves under pressure to deliver more business value in less time. Software publishers and enterprise development teams have attempted to address the issue with Agile development practices, which are cross-functional in nature. But Agile practices do not guarantee that the code running on a developer's machine will work the same way in production. DevOps closes that gap by promoting collaboration and project visibility between development and IT operations, which accelerates the delivery of better-quality software.

Data scientists and data science teams often face challenges that are similar to the challenges software development teams face. For example, some of them lack the cross-functional collaboration and support they need to ensure their work is timely and actually provides business value. In addition, their algorithms and models don't always operate as they should in production because conditions or the data have changed.

"For all the work data scientists put into designing, testing and optimizing their algorithms, the real tests come when they are put into use," said Michael Fauscette, chief research officer at business solutions review platform provider G2 Crowd. "From Facebook's newsfeed to stock market 'flash crashes,' we see what happens when algorithms go bad. The best algorithms must be continuously tested and improved."

DevOps practices can help data scientists address some of the challenges they face, but it's not a silver bullet. Data science has some notable differences that also need to be considered.

Following are a few things data scientists and their organizations should consider.

Achieve More Consistent Results and Predictability

Like application software, models may run well in a lab environment, but perform differently when applied in production.

"Models and algorithms are software [so] data scientists face the traditional problems when moving to production – untracked dependencies, incorrect permissions, missing configuration variables," said Clare Gollnick, CTO and chief data scientist at dark web monitoring company Terbium Labs. "The ‘lab to real world’ problem is really a restatement of the problem of model generalization. We build models based on historical, sub-sampled data [and then expect that model] to perform on future examples even if the context changes over time. DevOps can help close this gap by enabling iterative and fast hypothesis testing [because] 'fail fast' has nice parallels to the ‘principle of falsifiability’ in science. If [a hypothesis] is wrong, we should reject [it] quickly and move on."

One reason a model may fail to generalize is overfitting, which occurs when a model is so complex that it starts finding patterns in noise. To prevent that result, data scientists use methods including out-of-sample testing and cross-validation. Those methods, which are familiar to data scientists, are part of the model-building process, according to Jennifer Prendki, head of Search and Smarts Engineering at enterprise software company Atlassian.
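As a concrete illustration of those out-of-sample checks, the split logic behind k-fold cross-validation can be sketched in a few lines of plain Python. The fold count and seed below are arbitrary choices for illustration, not values prescribed by anyone quoted in this piece:

```python
import random

def k_fold_splits(n_samples, k=5, seed=0):
    """Yield (train_idx, test_idx) index pairs for k-fold cross-validation.

    Each sample lands in exactly one test fold, so every prediction used
    to score the model is made on data the model was not fitted on.
    """
    idx = list(range(n_samples))
    random.Random(seed).shuffle(idx)       # shuffle once, reproducibly
    folds = [idx[i::k] for i in range(k)]  # k roughly equal folds
    for held_out in folds:
        train = [j for fold in folds if fold is not held_out for j in fold]
        yield train, held_out

# Typical use: average a model's score across the k held-out folds.
splits = list(k_fold_splits(20, k=5))
```

A model that scores well on training data but poorly across these held-out folds is showing exactly the overfitting the paragraph above describes.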

"The biggest challenge, model-wise, comes from non-stationary data. Due to seasonality or other effects, a model that performed well yesterday can fail miserably tomorrow," she said. "Another challenge comes from the fact that models are trained on historical (static) data and then applied in runtime. This can lead to performance issues as data scientists are not used to thinking about performance."

Enable More Consistent Processes

DevOps gives developers and IT operations professionals visibility into what the other is doing, enabling lifecycle views and approaches in place of the traditional hand-offs that tend to cause dissension, finger-pointing and rework. Data scientists are involved in problem-solving lifecycles from the formation of a hypothesis through hypothesis testing, data collection, analysis and insights, but they may lack the collaboration and support they need from other parts of the organization.

"[Data scientists] may be amazing at designing and training a model, but making it production-ready, tested and deployable easily is generally something new," said Russell Smith, co-founder and CTO of quality assurance platform provider Rainforest QA. "If you find a data scientist that can do that, you've found the new unicorn."

Data failures can happen randomly over time, particularly when one is working in a system that uses external data. According to Terbium Labs' Clare Gollnick, an extreme or a problematic data point might not be observed for days or weeks, so a data-focused DevOps culture needs to rely even more heavily on continuous monitoring as part of the lifecycle and feedback loop. 
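Such monitoring need not be elaborate. As an illustrative sketch (the 4-sigma cutoff and 30-observation warm-up are arbitrary choices, not something Gollnick prescribes), a stream of incoming values can be screened for extreme points with a running mean and variance:

```python
class OutlierGuard:
    """Flag incoming values that deviate wildly from what has been seen so far.

    Uses Welford's online algorithm for the running mean/variance; the
    sigma cutoff and warm-up length are illustrative knobs.
    """
    def __init__(self, sigma=4.0):
        self.n, self.mean, self.m2, self.sigma = 0, 0.0, 0.0, sigma

    def observe(self, x):
        """Return True if x looks extreme relative to history, then record it."""
        extreme = False
        if self.n >= 30:  # need some history before judging
            std = (self.m2 / (self.n - 1)) ** 0.5
            extreme = std > 0 and abs(x - self.mean) > self.sigma * std
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return extreme
```

A real pipeline would feed each incoming value (or a per-batch summary statistic) through `observe` and route flagged points to an alerting channel rather than waiting days or weeks for a bad data point to surface.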

Machine learning systems are no exception because they need to be retrained on a regular basis. The frequency with which a model is trained is usually defined arbitrarily by the data scientist who developed that model, according to Atlassian's Jennifer Prendki.

"Machine Learning models are probabilistic [so] a model doesn’t suddenly ‘stop working;’ its predictions get progressively worse over time and therefore it is hard to make a call when the model needs revision," she said. "A DevOps-type approach is definitely valuable; however, the challenge is based on the ability to identify meaningful and appropriate metrics to monitor the system."

Improve Quality

DevOps and other modern software development practices, including continuous delivery, emphasize the need for continuous testing. Similarly, data scientists should be monitoring models and algorithms more often than they do.

"Testing is a massive looming weak spot for data science," said Rainforest QA's Russell Smith. "Testing, especially when you're deploying changes, will give you and your team the confidence things are working as expected. Continuous testing can also help [ensure] that models that generally receive ever-changing or new content are behaving as expected, [which] is especially applicable if the models are training themselves or are re-trained. Currently, this is only happening with the most advanced teams, but it should be a much wider practice."

Testing is less straightforward in data science than it is in software development, however. For one thing, the definition of success in data science is vague, according to Terbium Labs' Clare Gollnick.

"Ground truth is often not known, so there is nothing concrete to test against," said Gollnick. "We may choose instead to seek probabilistic improvement. The stochastic nature of these metrics can make automated tests difficult, if not entirely elusive. By necessity, we rely more heavily on continuous monitoring than continuous testing."

Break Down Organizational Barriers

Developers and IT operations have traditionally been at odds because their responsibilities were divided: developers built software, and operations ensured it ran in production. Similarly, data scientists may find themselves at odds with others in the organization, including developers and operations, a friction DevOps practices can help ease.

Tensions may arise between software engineers and data scientists because their orientations differ. Clare Gollnick of Terbium Labs said data science is trying to ascertain if something works in a particular way while traditional engineering testing attempts to prove that something does work in a particular way.

Rainforest QA's Russell Smith sees friction between data scientists and operations. Unless data scientists are doing their own ops or they've embraced DevOps, someone else has to deploy, run and monitor their systems.

Advance Security

As software teams have endeavored to deliver software faster, more types of testing have "shifted left," which means developers aren't just running unit tests (which has historically been the case); they're now running other types of tests, including performance testing, load testing and, more recently, security testing. The Shift Left trend doesn't mean that testers, quality assurance or security professionals aren't needed; it simply means that if security is built into products from the beginning, fewer issues arise later in the software lifecycle to delay software delivery.

Clare Gollnick of Terbium Labs said the Shift Left movement is causing data scientists to engage more in engineering thinking than security thinking, but security may be next as the cycle continues, particularly given the importance of data security.

Atlassian addresses machine learning-related security by building models on a tenant-by-tenant basis in order to avoid accidental transfer learning that might cause information leaks from one company to another. In the case of enterprise search, it would be possible for data scientists to build models using all the data available and, based on permission settings, filter out the results a specific user is not authorized to see. While that approach may seem sound, some of the information in the training data is learned by the algorithm and carried into the model itself, which would make it possible for a user to infer the content of pages they are not permitted to view.

Lisa Morgan, https://www.informationweek.com

Alibaba Cloud plays host to Red Hat Enterprise Linux

Red Hat Enterprise Linux is now globally available on Alibaba's Cloud Marketplace as a pay-as-you-go subscription model, enabling businesses to deploy and manage applications, build virtualised environments and create hybrid clouds.

The integration with Alibaba Cloud means businesses can add on the company's extra services if required, such as its desktop options and developer tools. 

“Alibaba Cloud is excited to offer Red Hat Enterprise Linux on our marketplace globally, as this is a recognition of Alibaba Cloud’s capability to provide a more secure, scalable, and stable environment to modernize businesses," Yeming Wang, general manager of Alibaba Cloud Europe said. 

"By working with Red Hat, we can provide our customers with a powerful and reliable platform capable of deploying cloud architectures to support the current and future business demands of digital transformation for enterprises.”

Alibaba Cloud completed the certification process required to re-sell Red Hat's solutions on its cloud marketplace last year, confirming it can provide a safe, scalable and robust environment for businesses. The offering will be available to both companies' customers, ISVs and developers, giving them a cost-effective, manageable way to bolster their infrastructure.

“With Alibaba Cloud’s global reach, having Red Hat Enterprise Linux on Alibaba Cloud Marketplace offers customers greater flexibility in the hybrid cloud," said Mike Ferris, vice president of technical business development and business architecture at Red Hat.

"At the heart of our business is the desire and commitment to provide a greater choice of solutions in the cloud. Working with one of the world’s leading public cloud services providers helps us take another step towards that goal.”     

http://www.cloudpro.co.uk

Demystifying Artificial Intelligence: Not Science Fiction Anymore

One of the problems in demystifying artificial intelligence is that once an AI-based product reaches the public, "we stop calling it AI," said Tara Chklovski, the CEO and founder of Iridescent, a nonprofit that aims to educate and empower children and their parents on engineering and technology matters.

Instead, she went on, "we call it GPS."

This is true. And it's a significant truth because it means the world of AI stays mired in the average person's mind as something of a science fiction-type character -- a Terminator programmed to kill, a Matrix hero designed to liberate, a Star Wars robot set to serve.

But AI is not one and the same as a robot. Simply put, AI is everywhere.

Everywhere.

And people ought to know. The average person ought to be aware.

It's guiding GPS and Google Maps. It's on Facebook. It's in Google. It's driving Spotify and Pandora. It's at work in Amazon's online shopping. It's on smartphones and smart TVs.

In a sense, AI is watching and recording almost all modern day human behaviors.

Ever wonder where all those recommendations come from when you buy a book on Amazon or watch a movie online? That's AI at work, taking note of your selections and processing the data to determine similarities that might interest you -- selections within the same genre, selections commonly chosen by others. What's more, this is AI with a thinking cap, honing its recommendations based on your behaviors and selections. The more you engage, the more you choose, the smarter AI grows, the more targeted its recommendations become both to you and for you.

Buy, say, a Charles Dickens book, a telescope and a new jacket on Amazon, and AI might come back the next time you log on to the site with some recommendations for more Dickens titles, telescope tripods and similarly styled jackets. But buy six Dickens books over the course of three weeks, all on Amazon, and AI's going to home in on those titles and offer more from Dickens, more on Dickens himself and more from Dickens-era authors.

That's AI at work. Specifically, that's called machine learning -- or, as Amazon puts it, "content personalization," the AI science of "using predictive analytics models to recommend items or optimize website flow based on prior customer actions."
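A toy sketch of the co-occurrence idea behind such recommendations follows. The item names and purchase baskets are invented for illustration, and Amazon's actual systems are of course far more sophisticated than counting pairs:

```python
from collections import Counter
from itertools import combinations

# Invented purchase baskets; each set is one customer's order.
baskets = [
    {"dickens_novel", "telescope", "jacket"},
    {"dickens_novel", "telescope_tripod"},
    {"dickens_novel", "dickens_biography"},
    {"telescope", "telescope_tripod"},
    {"telescope", "telescope_tripod", "star_atlas"},
]

# Count how often each ordered pair of items is bought together.
co_occurs = Counter()
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        co_occurs[(a, b)] += 1
        co_occurs[(b, a)] += 1

def recommend(item, k=2):
    """Return up to k items most often co-purchased with `item`."""
    scores = Counter({b: n for (a, b), n in co_occurs.items() if a == item})
    return [b for b, _ in scores.most_common(k)]
```

The "thinking cap" the article describes amounts to updating these counts with every new purchase, so the recommendations sharpen as behavior accumulates.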

Creepy?

In a sense.

But this is the world we live in nowadays. Like it or not, AI's only going to grow more prevalent.

Enter Chklovski and her Iridescent, trying to bridge the very wide gap between techno-geek science-speak and the average layperson who is completely unaware of all of AI's applications. This is a long-overdue endeavor; what has been sadly lacking in the field of AI is outreach to the layperson.

In New Orleans for a conference on AI, Chklovski helped arrange a tutorial for elementary level students at Kipp East Community Primary School and, just as importantly, their parents. This was a hands-on initiative to give students and parents a brief rundown of two AI concepts -- the processing of Big Data and the self-driving car -- and then to allow them to design and build their own models based on those ideas.

On the surface, it sounds brainiac.

But it wasn't. It's not. These were average families coming together to build a paper and ping pong ball model that mimics an AI computer's parallel processing -- the way it sorts Big Data -- or to construct a self-driving car game using a circuit and LED. It took place February 3 and Kipp East was just one of 60 or so select schools across America kicking off the same 18-week event, called the AI Family Challenge.

The rationale behind the challenge is exciting. As Chklovski explained, the idea is to show both kids and their parents how AI can work in the practical, how they can build their own AI applications and how they can take those AI designs and use them in their own communities.

It's a terrific idea. We need more like it.

All schools should adopt the challenge, in fact -- or at least adapt it to fit their individual communities.

Here's why: AI is a fast-moving, massive and seemingly perplexing field that cries out for a deep injection of the layperson's perspective, a cooler-headed, commonsense style of teaching that can convince even the most prolific science-fiction movie watcher that not all machine learning is evil.

After all, the more education, the less fear.

Let's demystify. Artificial intelligence should not have to be embedded in the national layperson's consciousness as an image of a killer 'bot, bent on wiping out the human race.

https://www.newsfactor.com

Samsung Unveils Galaxy S9 and S9 Plus at Mobile World Congress

While unveiling its newest flagship smartphones at Mobile World Congress in Barcelona yesterday, Samsung also took a few digs at its competitor Apple and that company's latest device, the iPhone X.

Where Apple reinvented emojis with the iPhone X's Animoji, Samsung has developed VR Emoji, which lets users create cartoon-like, augmented reality-powered stickers and videos based on their own selfies. And while Apple eliminated the headphone jack with the iPhone 7 in favor of its own Lightning connector and wireless AirPod ear buds, Samsung said its new Galaxy S9 and S9 Plus were designed to give customers features they actually wanted.

"We deliver the innovations they want -- like an immersive screen, extended battery life, a spectrum of unlocking options, and devices with expandable storage and a headphone jack," Samsung said in yesterday's announcement at MWC18 about its new phones. "Instead of forcing people to adapt to their technology, we give our users the flexibility to choose how to incorporate technology into their lives."

'Reimagined' Camera for Visual Communication

Samsung officially took the wraps off its Galaxy S9 and S9 Plus during an on-stage "Unpacked" event at the Mobile World Congress, which runs today through March 1. DJ Koh, president of the company's mobile communications division, introduced the S9 family as "the first smartphone built for the way you use your phone today, and the only smartphone with the power to help you reimagine how you'll experience the world tomorrow."

Featuring a "reimagined," dual aperture camera, the S9 and S9 Plus enable better imaging for high-speed and low-light photography, slow-motion imaging, and after-the-fact refinements made possible through artificial intelligence, Samsung said.

All those capabilities are aimed at a population of smartphone users that is increasingly communicating visually rather than via voice or text, according to Tim Baxter, president and CEO of Samsung Electronics North America.

"Most of our communication occurs in photos, in images," Baxter said in a Samsung video about the new phones. "The S9 is designed specifically to take advantage of that."

Beyond their photo- and video-taking capabilities, the S9 and S9 Plus cameras, coupled with Samsung's Bixby artificial intelligence, will also let users search online for products they see in real life, translate foreign languages on the fly, estimate the calories in foods, and virtually try on makeup.

Hitting the Market March 16

The new S9 devices will also be Samsung's first Galaxy smartphones to come with stereo speakers. In addition, they each feature a Quad HD super AMOLED display, a near-bezel-free Infinity Display, water resistance, an iris sensor for biometric security, and wired and wireless fast charging.

The 5.8-inch, 64 GB Galaxy S9 will come with 4 GB of RAM and a 3,000mAh battery, while the 6.2-inch, 64 GB Galaxy S9 Plus provides 6 GB of RAM and a 3,500mAh battery. Both are powered by Android 8.0 Oreo, and both phones will be available in four colors: lilac purple, midnight black, titanium gray, and coral blue.

Available for customers in the U.S. via reservation today, the Samsung Galaxy S9 and S9 Plus are set to arrive March 16. The S9 is priced starting at $719.99, while the S9 Plus will go for $839.99 and up.

https://www.newsfactor.com

Intel expects PCs with fast 5G wireless to ship in late 2019

With the first deployments of 5G high-speed wireless technology within the U.S. scheduled for later this year, Intel and its PC partners are already thinking about the next step: rolling out 5G-equipped PCs late in 2019.

Intel, along with Dell, HP, Lenovo, and Microsoft, said Thursday that the companies expect the first 5G Windows PCs to become available during the second half of 2019. That’s about the same time that Intel plans to begin shipping its XMM 8000 commercial modems, marking the company’s entrance into the 5G market.

Intel will show off a prototype of the new 5G connected PC at the Mobile World Congress show in Barcelona. In addition, the company will demonstrate data streaming over the 5G network. At its stand, Intel said that it will also show off eSIM technology—the replacement for actual, physical SIM cards—and a thin PC running 802.11ax Wi-Fi, the next-gen Wi-Fi standard.

Though 5G technology is the mobile industry’s El Dorado, it always seems to be just over the next hill. Intel has promoted 5G for several years, saying it will handle everything from a communications backbone for intelligent cars to swarms of autonomous drones talking amongst themselves. 

Carriers, though, have started nailing down when and where customers will be able to access 5G technology. AT&T said Wednesday, for example, that a dozen cities including Dallas and Waco, Texas, and Atlanta, Georgia, will receive their first 5G deployments by year’s end. Verizon has plans for three to five markets, including Sacramento, California. T-Mobile and Sprint are taking it a bit slower, with T-Mobile planning its first 5G deployments in 2019. Sprint, too, is aiming for its first 5G presence to debut next year.

For Intel, 5G offers a rebirth of sorts for its wireless business. Having been forced to sit out much of the 4G wave because of competitors' advanced, integrated platforms, Intel sees 5G as essentially an opportunity to start fresh.

“Intel is investing deeply across its wireless portfolio and partners to bring 5G-connected mobile PCs to market, with benefits for users like high quality video on-the-go, high-end gaming, and seamless connections as users traverse WiFi and Cellular networks,” the company said in a statement.

That’s the reason Intel is also invested in bringing 802.11ax to market, the next improvement in Wi-Fi technology. Like 5G, 802.11ax—which promises Wi-Fi speeds across multiple gigabits per second—has yet to be finalized, let alone deployed.

That still hasn’t stopped so-called “pre-standard” hardware, like D-Link’s new routers. Once the 802.11ax standard is fully baked in 2018 or early 2019, compliant networking gear can begin to roll out and undergo interoperability testing. (Typically, pre-standard wireless technology attempts to bring those products in line with the final spec via a firmware update.) In the meantime, though, Intel is shoveling coal into the next-gen hype train.

What this means for you: Keep an eye out for smartphone vendors to begin talking up their own 5G plans once the Mobile World Congress show begins February 26. The higher-speed modems will begin showing up in hardware rolling out this year and next, with smartphone makers being forced to choose between tried-and-true 4G modems and more advanced (and possibly more power-hungry) 5G radios on the horizon. PC vendors will follow closely behind.

http://computerworld.in

IoT, Cloud have the biggest potential to disrupt IT risk management: Survey

MetricStream, the independent market leader in governance, risk, and compliance (GRC) apps and solutions, announced the findings of its latest survey, Moving Up the IT Risk Management Maturity Curve: An In-Depth Look at How Enterprises Are Managing and Mitigating Their IT Risks. The survey covered 139 respondents from 120+ enterprises and 20 industries across more than six geographic regions to examine the maturity of IT risk management programs, methods used to assess IT risks, impact of IT regulations, and the role of technology in managing IT risks.

Disruptive technologies

The respondents reported that Internet of Things (IoT) technologies have the biggest potential to disrupt IT risk management programs, followed by the cloud, Bring Your Own Device (BYOD), fintech, and blockchain. These rapidly evolving technologies enable businesses to create significant value, but they also bring new security and privacy risks that are still not fully understood.

IT GRC solutions as maturity enablers

Respondents were asked to rate their IT risk management maturity on the five-level Capability Maturity Model Integration (CMMI) scale. Seventy-five percent of the respondents who have implemented IT GRC solutions reported a CMMI maturity level of 3 or higher. In comparison, only 38 percent of the respondents who have not implemented IT GRC solutions reported a similar level of maturity; these organizations rely on a combination of spreadsheets, point solutions, and other tools which, going by the survey results, may not be as effective as IT GRC solutions in improving the maturity of IT risk management programs.

Top IT threats

The top 5 IT threats and risks that respondents reported facing in the last two years are malware infections, security breaches, compliance violations and regulatory actions, account phishing, and spoofs of company executives.

Maturity and potential vulnerabilities

Respondents reported relatively high levels of maturity (CMMI level 3 or higher) in IT risk identification and assessments, standardized documentation of processes and controls, control design and assessments, and IT risk monitoring and reporting.

However, when it came to IT risk management training, 51 percent of the respondents, and 52 percent of those in banking and financial services, reported a CMMI maturity of only level 1 or level 2. This lack of maturity could prove disastrous, as poorly trained employees fall prey more easily to social engineering attacks such as phishing which, in turn, open the door to larger attacks on enterprise security.

Despite these risks—and the fact that account phishing and spoofing were cited among the top 5 enterprise IT threats—it appears that respondents are not investing enough in employee training programs or better information governance frameworks. Instead, the top priority for respondents in the Middle East, Asia-Pacific, and Africa is investing in IT security solutions, while in Europe, the US, and Canada, the primary focus is on compliance with government regulations.

“Guarding against the next Equifax-style cyber-attack will require enterprises to have holistic, agile IT risk management programs,” said French Caldwell, chief evangelist, MetricStream. “An IT GRC software solution can really add value by automating workflows, and providing timely risk intelligence to guide decisions. However, it’s just one piece of the pie. Policies, training programs, and information governance frameworks are all equally important. Together, they lay the foundation for a resilient and secure enterprise.”

http://cio.in/

Riverbed delivers enhanced SD-WAN, cloud networking solution

Riverbed Technology announced major updates to its SD-WAN and cloud networking solution, SteelConnect, the first and only SD-WAN solution that provides unified connectivity and policy-based orchestration spanning the entire distributed network fabric – hybrid WAN, branch LAN/WLAN, data centers, and the cloud. With the latest upgrades to SteelConnect, Riverbed is expanding the power of one-click connectivity and optimization into AWS and Microsoft Azure with added support for AWS Direct Connect and Azure ExpressRoute. Riverbed is also introducing seamless integration between SteelConnect and Riverbed Xirrus Wi-Fi with zero-touch provisioning of Riverbed Xirrus access points, and is providing new deployment flexibility with the addition of LTE wireless options for SteelConnect SD-WAN gateways.

Enterprises are striving to adopt more cloud-based applications and services in order to accelerate digital transformation, increase business agility and reduce costs. However, the resulting hybrid IT landscape often increases operational complexity and introduces new security risks across the network. According to a recent ESG survey, 91 percent of companies surveyed agree that incorporating cloud-based applications into their portfolio of corporate applications has increased the complexity associated with managing remote and branch offices (ROBOs). The pervasive use of mobile devices and the rise of IoT further challenge IT’s ability to secure and manage network access, with an exponential increase in the number of devices and endpoints at the network edge.

To address these challenges, Riverbed SteelConnect helps businesses transform their approach to networking, with a new software-defined and cloud-centric approach that yields dramatic gains in agility, efficiency and flexibility so that businesses can reach their cloud and digital goals faster. Riverbed SteelConnect has gained significant momentum over the last couple years, with hundreds of enterprise customers across all industries benefitting from an innovative and more comprehensive approach to software-defined networking.

“Conventional approaches to networking are both out-of-date and out-of-sync with modern use cases, and are failing to provide the agility and efficiency that IT professionals need in order to support their businesses, especially as they embark on digital and cloud initiatives,” said Paul O’Farrell, senior vice president and general manager of Riverbed’s Cloud Infrastructure Business Unit. “Riverbed SteelConnect is helping organizations break through conventional limitations by taking the power of zero-touch provisioning and policy-based orchestration across the entire distributed fabric – WAN, LAN/WLAN, data center and the cloud. With new enhancements for integrated Wi-Fi, automated cloud networking and flexible connectivity options, Riverbed is once again raising the bar for networking in the cloud and digital era with the most complete and easy-to-use solution in the market today.”

Enhanced automated cloud connectivity for greater flexibility. Riverbed SteelConnect now supports AWS Direct Connect and Azure ExpressRoute, providing customers with increased deployment flexibility. With this capability, customers can securely connect to their cloud resources over public internet or dedicated private links for automated cloud connectivity and improved performance through single-click cloud automation.

Unified management of SD-WAN and Wi-Fi. Riverbed SteelConnect SD-WAN now seamlessly integrates with Xirrus Wi-Fi for simple deployment and management of cloud networks. Customers can manage SD-WAN and Wi-Fi together through a centralized cloud console, including zero-touch provisioning of Riverbed Xirrus Wi-Fi Access Points, creation of Wi-Fi networks, and Wi-Fi monitoring.

“As Australia’s largest and most iconic lighting retailer, it is essential we keep pace with the digital age, and still provide an optimal in-store experience for our customers,” said Mick Tan, CIO at Beacon Lighting. “Riverbed SteelConnect appealed to us because of the ability to simply deploy and manage Wi-Fi and SD-WAN together from a single cloud console. With these capabilities, we can streamline and reduce the management cost of our nationwide store network infrastructure, and easily prioritize network traffic for point-of-sale, dynamic marketing, and stock management systems, ensuring business continuity and continuous revenue generation is maintained. SteelConnect also simplifies the roll out of new stores as Beacon Lighting’s business expands, and provides reliable in-store Wi-Fi for a modern, digital shopping experience that our customers demand.”

Supported LTE wireless for maximum reach and connectivity. Riverbed SteelConnect now supports LTE uplinks for a variety of use cases including back-up network connectivity in retail stores, pop-up stores, rural sites, or mobile retail. By leveraging LTE, retailers can maximize reach and productivity while increasing resiliency and agility of the network.

Setting the pace for network innovation in the cloud era. SteelConnect, initially launched as an early access offering in April 2016, is the industry’s first and only product that provides unified connectivity and policy-based orchestration spanning the entire network – LAN/WLAN (including cloud-based Riverbed Xirrus Wi-Fi), WAN, data center and the cloud, with one-click connectivity and optimization into AWS and Microsoft Azure. SteelConnect also enables zero-touch provisioning, allowing an enterprise to set up a global network and connect to the cloud in minutes, and easy ongoing network management that provides the ability to make network or business/application policy changes with a few clicks of a mouse.

Since the initial launch of SteelConnect, Riverbed has continued a rapid pace of innovation, introducing a broad set of unique capabilities, including integration with SteelCentral for visibility and insight, integration with the market-leading WAN Optimization solution (Riverbed SteelHead SD), and following the strategic acquisition of Xirrus (leading cloud-based Wi-Fi company) in May 2017, enterprise-grade Wi-Fi – bringing the power of policy-based network management out to the wireless edge. SteelConnect also seamlessly integrates with third-party network security services, such as Zscaler’s cloud security platform, to help IT professionals achieve agility, performance and security across the distributed enterprise.

http://www.computerworld.in

What’s new in Google’s Dart language

Google’s Dart language, once positioned as a potential replacement for JavaScript in the browser, is being rebooted for client-side web and mobile development in Version 2 of the language. A beta version is now available.

Dart 2 features a strengthened type system, a cleaned-up syntax, and a rebuilt developer tool chain. Dart has a succinct syntax and can run on a VM with a just-in-time compiler, with the compiler enabling stateful, hot reload during mobile development.

Developers also gain from fast development cycles where code can be edited, compiled, and replaced in apps running on a device. Compiling code ahead of time provides fast startup, Google said.

Dart can be compiled to native code for ARM and x86 platforms. Google has used the language to build applications for iOS, Android, and the web.

Dart 2 features

Capabilities for client-side development in Dart 2 include:
  • Strong typing to catch bugs earlier, boost quality, and improve applications built by large teams.
  • Defining the UI as code, reducing the need for context switching between a UI markup language and the programming language.

The language also has web-specific libraries such as dart:html and a full web framework.

Where to download Dart 2

You can try the beta of Dart 2 in Flutter by downloading it from GitHub or via the Dart SDK.

https://www.infoworld.com

Saturday, 24 February 2018

Reltio Cloud Brings Continuous Data Organization to Enterprises

As cloud services become more sophisticated, they also become more granular and specialized. As of Feb. 21, add another smart cloud to a growing list of options for enterprises.

Reltio, known for developing something called the Self-Learning Data Platform, launched Reltio Cloud 2018.1, a new version of the native cloud platform that organizes enterprise data for continuous learning and updating. It constantly rearranges itself, reports on the condition of a company’s data and even offers suggestions on ways to be more efficient.

The release includes the next evolution of Reltio IQ--formerly Reltio Insights--which uses advanced analytics and machine learning for day-to-day operations and applications. A key feature of Reltio IQ, the company said, is the ability to derive IQ scores or recommendations for efficient data organization and for business insight, and to embed them back into customer, account or product profiles for easy segmentation and operational execution.

The platform offers continuous data organization, recommended actions and measurable results. Five key things it does are:
  • make data the heart of every decision;
  • organize data of all types at unlimited scale;
  • unify data sets while providing unlimited personalized views;
  • infuse analytics into operational business processes; and
  • continuously learn about customers, products and their relationships.
Reltio Cloud 2018.1 continues to organize billions of multi-domain profiles, relationships, and interactions at petabyte-scale, and is being used for day-to-day operational activities by thousands of IT and business users globally, the company said.

This new release includes:
  • Seamless analytics and machine learning: Reltio IQ processes data, organized and consolidated within Reltio Cloud and the Reltio Self-Learning Graph to provide a trusted foundation of reliable data for analytics and machine learning. An on demand Apache Spark environment allows data scientists to spend their time tuning algorithms via their tools of choice, rather than aggregating and cleaning data. Reltio IQ then seamlessly synchronizes IQ attributes and recommendations back to profiles for use in operational Data-Driven Applications. IQ scores can also be benchmarked anonymously across industry segments to give companies greater insight into where they stand relative to their peers.
  • Advanced master data management and reference data management: Match IQ provides advanced statistics together with new Proximity, Cross-attribute, and Groups/Household rules to improve profile consolidation. Reference Data Management, included as core Reltio Cloud functionality, has a new UI for mapping and translation from multiple sources.
  • Increased platform agility and performance: Export IQ, an ultra-fast export, now supports flexible filtering and billions of high performance API calls per day, for real-time business operations.
  • Continuous compliance: Enhanced General Data Protection Regulation (GDPR) compliance has built-in features to execute “right to be forgotten” requirements through audit and history of contributing source indexes, as well as deletion of data on demand.
  • Actionable statistics and simplified configuration: Reltio Console supports the ongoing management and health of the Reltio Cloud environment, and the underlying data assets being organized. The Console has improved statistics, and configuration validation to proactively recommend and enforce best practices, as well as an upgrade to the Reltio UI Modeler to further customize data-driven application profile pages.
The latest version of the Reltio Self-Learning Data Platform will be showcased Feb. 26 and 27 by customers and partners at DataDriven18, the Modern Data Management Summit in San Francisco, one of the largest gatherings of data management and machine learning experts.
http://www.eweek.com

Qualcomm Offers Previews of Its 5G Chipsets, Services

Qualcomm Technologies is juggling a few things right now, and they all are being managed here at its headquarters in a glorious, sun-drenched vacation destination, set a few minutes north of San Diego off Interstate 5.

Item No. 1: Chipmaker Broadcom is trying to pull off a hostile takeover of the $96 billion company with its “latest and best bid” of $82 per share (current stock value: $65), claiming that Qualcomm’s management is ineffective at just about everything it does. With Broadcom in charge, it claims, shareholders will realize far more from their investments.

News update: On Feb. 8, Qualcomm’s board rejected the bid for a second time (the first rejection, of a $70-per-share bid, came back in November). Singapore-based Broadcom is a stubborn player, however, and may continue its unsolicited takeover effort. Meetings are happening again this week (Feb. 15). Stay tuned.

Item No. 2: On the legal front, Qualcomm was slapped three weeks ago with a $1.2 billion antitrust fine by the European Commission (EC) for allegedly making billions of dollars in payments to Apple in exchange for using only Qualcomm's LTE chipsets in iPhones from 2011 to 2016. The fine followed an EC investigation, which found that the payments were made to keep Apple from using similar parts from competing vendors.

The antitrust agency claims Qualcomm abused its dominant market position in making the payments to Apple and that the payments blocked consumers from access to product choices and innovation in their device purchases. We’ll have to wait and see how the ensuing litigation—which could take years--turns out.
Heavy stuff, for sure. Despite all this, there is good news for the company and its customers. Hint: It’s about the future of the product line.
Item No. 3: Qualcomm has been doing a substantial amount of research and development work on 5G connectivity for several years. CEO and board member Steve Mollenkopf wasn’t in attendance at the 5G event (understandably), but other Qualcomm executives told a gathering of about 100 international tech journalists and analysts that they believe the company is one to two years ahead of everybody else in developing silicon for 5G--the technology that’s said to be a full 100 times faster than 4G and that will prove to be critical to the development of self-driving cars.
The company on Feb. 6 and 7 welcomed journalists and analysts from the States and as far away as Egypt, France, Germany, the U.K., Japan, India and China. It offered insights into early-stage use cases to which 5G chips, chipsets and modems will be pointed; revealed an international group of 18 telecom partners who are busy investing in updating their systems in anticipation of the new hardware; and previewed a new set of 5G wireless services that are now being developed.
Why 5G is Such a Huge Step Forward
Why is 5G going to be such a major step forward, as opposed to previous iterations from 2G to 3G to 4G? Well, there are many reasons—a key one being that a 5G phone will never need Wi-Fi for anything.
Qualcomm has enticed a number of telcos to start testing its Snapdragon X50 5G modem for use in live, over-the-air mobile 5G NR (for New Radio) trials with multiple global wireless network operators in both the sub-6 GHz and millimeter wave (mmWave) spectrum bands. The 18 telcos in the program are AT&T, British Telecom, China Telecom, China Mobile, China Unicom, Deutsche Telekom, KDDI, KT Corporation, LG Uplus, NTT DOCOMO, Orange, Singtel, SK Telecom, Sprint, Telstra, TIM, Verizon and Vodafone Group.
The trials are based on the 3GPP Release 15 5G NR standard. The mobile 5G NR trials will utilize Qualcomm Technologies’ 5G mobile test platform and smartphone reference design, which incorporate the Snapdragon X50 chipset and optimize 5G technology within the power and form factor constraints of a smartphone while maintaining interoperability and coexistence with 4G LTE, the company said.
Starting Use Cases for 5G
Use cases for 5G will include enhanced mobile broadband for smartphones; always-connected PCs; head-mounted displays for virtual reality, augmented reality and extended reality hardware; and mobile broadband, all of which require constant, consistent cloud connectivity. Details include:
  • 5G-enabled smartphones: With 5G, consumers will never again need to log on to public Wi-Fi. They will also enjoy faster browsing, faster downloads, better-quality video calls, UHD and 360-degree video streaming, and instant cloud access.
  • Always-connected PCs: With the advent of 5G networks, "Always Connected" PCs will be able to utilize super high speed, low latency connectivity for the next level of cloud services, as well as high-quality video conferencing, interactive gaming and increased productivity due to the flexibility to work anywhere.
  • HMDs: 5G enhanced mobile broadband will further elevate virtual reality (VR), augmented reality (AR) and extended reality (XR) experiences with its increased capacity at lower cost and ultra-low latency - down to 1 millisecond.
  • Mobile broadband: Fiber speeds and massive capacity to support insatiable consumer demand for unlimited data, as well as superior mobile and home broadband internet access.
These products won't reach the market until 2019 at the earliest, Qualcomm executives said.
With all this new connectivity, there will certainly be a lot of professional services needed to aid with connecting standard laptops, phones, self-driving cars, drones, industrial equipment and other devices operating on the edge of networks, collecting data and using cloud services. 5G will help greatly to expand cell phone capability to billions of people while also extending functionality to trillions of things.
Qualcomm will be ready with these services when the time comes. These new wireless edge services will be available on select Qualcomm chipsets, initially the MDM9206 LTE modem for industrial IoT, MDM9628 for automotive products, and QCA4020 for home IoT products, with expansion to Snapdragon platforms later. They are designed to help enterprise users connect their devices in a secure and trusted manner, with the ability to perform large-scale deployments efficiently.
Qualcomm Unveils New Chipset for LTE Modems
Qualcomm this week also announced a new LTE modem chipset for mobile devices, the Snapdragon X24.
For the record, Qualcomm is an American fabless semiconductor company that designs, manufactures, and markets digital wireless telecommunications products and services. It produces processors and connectivity solutions, modems, displays, software and charging products.
Qualcomm is focused on various industries: automotive, education, healthcare, internet of things, mobile computing, networking, smart cities, smart homes and wearables. The company's products include Gobi, Hy-Fi, IPQ, IZat, Powerline, Snapdragon, Small Cells, VIVE, Wi-Fi platforms, Mirasol, Pixtronix, AllPlay, 2net, Brew, HealthyCircles, QChat, QLearn, RaptorQ, Vuforia, Halo, and WiPower.
http://www.eweek.com

What’s new in Apache’s NetBeans IDE for Java 9

The Apache Software Foundation has released a beta of its NetBeans Version 9.0 IDE, with support for the Java Module System introduced with Java 9 last year. Modules were the premier capability in JDK 9, which was released in September 2017. New features in the NetBeans 9.0 beta include:
  • ModulePath mode to enable the use of modules, in addition to supporting the longstanding classpath option for the runtime to search for classes and resource files.
  • The ability for a standard NetBeans project to serve as a Java Development Kit 9 module via a module-info.java file in the default package.
  • Support in modules for the full Edit-Compile-Debug-and-Profile cycle.
  • The ability to show module dependencies in the IDE.
  • A console-like UI for the Java Shell (JShell) read-eval-print-loop (REPL) tool, which can run with the user's project configuration.
  • Added actions in the Java profiler to expand and collapse nodes in tree table results.
  • Resizable popups in the profiler, to make it easier to handle long class or method names.
  • PHP 7.1 support, including class constant visibility, multicatch exception handling, and nullable types.
  • For PHP 7.0 development, a context-sensitive lexer.
  • Also for PHP, editor hints for void return types and incorrect nonabstract methods.
  • Code-folding for arrays.
  • Support in the C/C++ debugger for native dbx debugging.
  • Support in the C/C++ editor for the Clang-format formatting tool.
  • Also for C/C++ development, an experimental version of Clank-based diagnostics, which show the error path of a problem.
NetBeans 9.0 also adds a new project, Java Modular Project, for developing several JDK 9 modules in one Ant-based project. With it, Java modular app projects can be packaged into a JLink image for distribution of the application and required modules.

Where to download the NetBeans Version 9.0 beta

You can download the NetBeans IDE 9.0 beta from the Apache mirror site.
https://www.infoworld.com

Docker tutorial: Get started with Docker Compose

Containers are meant to provide component isolation in a modern software stack. Put your database in one container, your web application in another, and they can all be scaled, managed, restarted, and swapped out independently. But developing and testing a multi-container application isn’t anything like working with a single container at a time.

Docker Compose was created by Docker to simplify the process of developing and testing multi-container applications. It’s a command-line tool, reminiscent of the Docker client, that takes in a specially formatted descriptor file to assemble applications out of multiple containers and run them in concert on a single host. (Tools like Docker Swarm or Kubernetes deploy multi-container apps in production across multiple hosts.)

In this tutorial, we’ll walk through the steps needed to define and deploy a simple multi-container web service app. While Docker Compose is normally used for development and testing, it can also be used for deploying production applications. For the sake of this discussion, we will concentrate on dev-and-test scenarios.

Docker Compose example

A minimal Docker Compose application consists of three components:
  1. A Dockerfile for each container image you want to build.
  2. A YAML file, docker-compose.yml, that Docker Compose will use to launch containers from those images and configure their services.
  3. The files that comprise the application itself.

In our example, we’re going to create a toy web messaging system, written in Python using the Bottle web framework and configured to store data in Redis. Used as a production app, it would be horrendously insecure and impractical (not to mention underpowered!). But the point is to show how the pieces fit together, and to provide you with a skeleton you could flesh out further on your own.

To obtain all the pieces at once, download and unpack this docker-compose-example.zip file into a working directory. You will need to have the most recent version of Docker installed; I used version 17.12.01, which was the latest stable version at the time of writing. Note that this tutorial should work as-is on Docker for Windows, Docker for Mac, and the conventional Linux incarnation of Docker.

The Dockerfile included in the bundle is simple enough:
FROM python:3
# Unbuffered output, so application logs appear immediately
ENV PYTHONUNBUFFERED 1
# Copy the application code (including requirements.txt and app.py) into the image
ADD . /code
WORKDIR /code
RUN pip install -r requirements.txt
CMD python app.py
This set of commands describes an image that uses the stock python:3 image as its base, and uses two files (both included in the .zip bundle): requirements.txt and app.py. The former is used by Python to describe the dependencies of an application; the latter is the Python application itself.
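The bundled app.py is not reproduced here, but its core behavior (accepting messages and returning the newest ones from a Redis list) can be sketched without any dependencies. The class and method names below are illustrative stand-ins, not code from the actual file; in the real app, Redis (reached at host redis, the Compose service name) plays the role of the in-memory list:

```python
# Illustrative stand-in for the Redis-backed message store that app.py
# presumably wraps in Bottle routes. A real Redis list would replace
# this Python list.

class MessageStore:
    """Mimics LPUSH/LRANGE semantics on a Redis list."""

    def __init__(self):
        self._messages = []

    def push(self, text):
        # Like LPUSH: the newest message goes to the front of the list
        self._messages.insert(0, text)

    def recent(self, count=10):
        # Like LRANGE 0 count-1: a newest-first slice
        return self._messages[:count]

store = MessageStore()
store.push("hello")
store.push("world")
print(store.recent())  # -> ['world', 'hello']
```

The point of the sketch is the ordering semantics: like Redis LPUSH, new messages land at the front, so a newest-first page of recent messages is just a slice from index 0.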

The elements of the docker-compose.yml file are worth examining in detail:

version: "3"
services:
  redis:
    image: redis
  web:
    build: .
    command: python3 app.py
    volumes:
      - .:/code
    ports:
      - "8000:8000"
    depends_on:
      - redis
The version line specifies the version of the Docker Compose file format to use. Docker Compose has been through a number of revisions, each tied to specific versions of the Docker engine; version 3 is the most recent as of this writing.

The services section defines the various services used by this particular container stack. Here, we’re defining two services: redis (which uses the redis container image to run the Redis service), and web (our Python application). Each service descriptor provides details about the service:
  • build: Describes configurations applied at build time. It can be just a pathname, as shown here, or it can provide details such as a Dockerfile to use (instead of the default Dockerfile in the directory) or arguments to pass to the Dockerfile during the build process.
  • command: A command to run when starting the container. This overrides the CMD statement supplied in the container’s Dockerfile, if any.
  • volumes: Any paths to volumes to be mounted for this service. You can also specify volumes as a top-level configuration option and re-use any volumes defined there across multiple containers in the docker-compose.yml file.
  • ports: Port mappings for the container. You can use a simple format, as shown here, or a more detailed format that describes which protocols to use.
  • depends_on: Describes the order of dependencies between services. Here, because web depends on redis, redis must be brought up first when Docker Compose starts the app.

There are many more options available in services, but these few are enough to get a basic project started.
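As a hedged sketch of those more detailed forms (the volume name appdata below is illustrative, not part of the example app), long-syntax ports and a top-level volumes entry might look like this; note that the long port syntax requires Compose file format 3.2 or later:

```yaml
version: "3.2"
services:
  web:
    build:
      context: .
      dockerfile: Dockerfile    # an explicit Dockerfile, instead of the default
    ports:
      - target: 8000            # port inside the container
        published: 8000         # port exposed on the host
        protocol: tcp
    volumes:
      - appdata:/code

# A named volume declared once here can be reused by multiple services
volumes:
  appdata:
```

With a complete docker-compose.yml in place, `docker-compose up --build` builds the web image and starts both services in dependency order, and `docker-compose down` stops and removes them.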

Continue here: goo.gl/6u4vHv

www.infoworld.com