
Getting Started with Intel Edison: Sensors, Actuators, Bluetooth, and Wi-Fi on the Tiny Atom-Powered Linux Module (Make: Technology on Your Time)


The Intel Edison is a crowning achievement of Intel’s adaptation of its technology into maker-friendly products. They’ve packed the dual-core power of the Atom CPU, combined it with a sideboard microcontroller brain, and added in Wi-Fi, Bluetooth Low Energy, and a generous amount of RAM (1GB) and flash storage (4GB). This book, written by Stephanie Moyerman, a research scientist with Intel’s Smart Device Innovation Team, teaches you everything you need to know to get started making things with Edison, the compact and powerful Internet of Things platform.

Projects and tutorials include:

  • Controlling devices over Bluetooth
  • Using Python and Arduino programming environments on Edison
  • Tracking objects with a webcam and OpenCV
  • Responding to voice commands and talking back
  • Using and configuring Linux on Edison

$13.91



How to Start Incorporating Machine Learning in Enterprises



The world is long past the Industrial Revolution, and we are now experiencing the era of the Digital Revolution. Machine Learning, Artificial Intelligence, and Big Data Analysis are the reality of today’s world.

I recently had a chance to talk to Ciaran Dynes, Senior Vice President of Products at Talend, and Justin Mullen, Managing Director at Datalytyx. Talend is a software integration vendor that provides Big Data solutions to enterprises, and Datalytyx is a leading provider of big data engineering, data analytics, and cloud solutions, enabling faster, more effective, and more profitable decision-making throughout an enterprise.

The Evolution of Big Data Operations

To understand more about the evolution of big data operations, I asked Justin Mullen about the challenges his company faced five years ago and why it was looking for modern integration platforms. He responded: “We faced similar challenges to what our customers were facing. Before Big Data analytics, it was what I call ‘Difficult Data analytics.’ There was a lot of manual aggregation and crunching of data from largely on-premise systems. And then the biggest challenge that we probably faced was centralizing and trusting the data before applying the different analytical algorithms available to analyze the raw data and visualize the results in …

Read More on Datafloq


The Next Generation of Infrastructure Services


Today’s environment is one where business needs are rapidly changing and business cycles are accelerating faster than ever. To grow successfully, businesses require improved efficiencies from the digitalization of existing processes and the ability to rapidly launch new digital offerings. Enterprises need consistent and reliable support to run their IT systems across a “supply chain” of services from multiple vendors. As IT evolves to meet this demand, CIOs are making decisions every day about how to transform application and infrastructure architectures and how to ensure the right workload runs on the right cloud, all while reducing IT costs. There is no compromising on cost or reliability: both play an equally important role in a business’s success, and businesses need the services they are using to continuously evolve.

For the IT services industry, this means a shift away from a systems integration model focused on IT outcomes toward a services integration model focused on business outcomes. What has traditionally been a people-led, technology-assisted approach, relying on people-based processes and using technology to execute simple tasks, has progressed to one that is technology-led and people-assisted, using advanced technology to address complex tasks. In other words, most standard tasks and processes that once required manual administration will be executed by increasingly autonomic systems that are data-driven, cognitive, and automated.

We are in a new Cognitive Era of technology, where IT systems and services are fully integrated and where tasks and processes are delivered by increasingly autonomic systems that are data-driven. IT systems and services are becoming cognitive and software-defined, able to communicate among themselves and act autonomously. In the Cognitive Era of IT, cognitive computing is transforming the Digital Enterprise, and we believe cognitive services delivery is the next stage of the delivery lifecycle transformation. For example, a growing number of companies are using the IBM Services Platform with Watson and taking advantage of its wealth of operational data and experience. With this cognitive service, they’re gaining insights from their IT systems that either direct automated actions resulting in better business outcomes, or advise human experts to make better, data-driven decisions.

Using a simple analogy from biology, cognitive technologies like IBM Watson act as the cognitive brain (to generate the insights) and automation serves as the muscle (that executes the action). Information about the infrastructure systems we manage using the cognitive technologies is continuously fed back into the ‘brain’ to enable further learning and produce better outcomes. The infrastructure systems infused with the cognitive capabilities begin to understand, reason and learn, thus becoming self-healing and self-optimizing. In other words, they become autonomic systems.

Our cognitive services delivery is enabled by the IBM Services Platform with Watson, which comprises IBM’s Data Lake, the Cognitive Delivery Insights engine powered by Watson, and client-centric dashboards that provide full visibility into a client’s IT environment.

We expect that increasingly autonomic behavior will emerge in the context of business services or applications, with performance and user-experience goals, grounded in security and economic value, guiding the behavior of the infrastructure services. Additionally, augmenting application performance management and event management systems with pattern recognition and deep learning is a critical part of the cognitive, optimized infrastructure.

Underlying federated ontologies will govern the interaction, the ‘glue’, among the different levels of cognitive capabilities and enable micro and macro learning mechanisms to support each other. For instance, learning that does not yet lead to automatic actions can still feed into the corpus of knowledge, providing meaningful insights that help a human expert make better decisions.

In the emerging hybrid cloud world, all of the cognitive services on the IBM Services Platform with Watson are designed, built and run with a workload perspective as the driving force. They rely on software-defined environments that surface the properties of hardware devices through software interfaces. This enables the right workload to be orchestrated and directed to run on the right cloud.

Cognitive services delivery and integration require a platform that mediates between IT consumers and service providers, bridging the business perspective, represented by workloads and their owners and consumers, with the services supporting them.

This Platform is evolving from our existing investments in service brokerage, orchestration for hybrid clouds and advanced automation and analytics for the operational lifecycle.

The platform has three layers:

  1. A broker layer for governance of IT consumption, supported by a federated self-service catalog. This allows CIOs to have visibility and control over who uses which services, while giving users convenient access to services from potentially multiple providers in their services supply chain. Our IBM Brokerage Services provide leading functionality and are continuously being enhanced.
  2. An orchestration layer that ensures automated fulfillment and integration of the services across multiple providers, using blueprints or patterns of configurations that embody best practices.
  3. An operational lifecycle layer that provides services management driven by automation and analytics. The Platform uses analytics and cognitive technologies on the operational data it collects to continually improve the quality of services delivery, coupled with automation for increasingly autonomic behavior, end-to-end security, and improved efficiency.

The IBM Services Platform with Watson is data-driven, cognitive, and automated, fueled by a vast data lake, a powerful analytics engine, and cognitive services spanning all the layers and the entire lifecycle of the IT environment. With their environments managed by this Platform, our clients enjoy a higher quality of service, better visibility, and improved spend control over their IT environment. The next generation of infrastructure services is delivered by the IBM Services Platform with Watson.

The post The Next Generation of Infrastructure Services appeared first on THINK Blog.




Securing Devices in the Internet of Things


Anyone doing IoT would agree that security is paramount for the safe and secure operation of devices. Devices are among the most critical elements in the Internet of Things (IoT). Security breaches at the device level can result in severe damage, including financial losses, loss of credibility and trust, and, in some extreme cases, even danger to human life. However, a number of high-profile cases involving large organizations have indicated that these breaches are not the result of a single access point being compromised; rather, they are the result of multiple points of failure. In this scenario, making even one of these access points impenetrable would mitigate such breaches and in turn minimize the damage caused. There are, however, several challenges in designing security measures into these devices, and developers need to be able to identify the “just enough” security measures to put in place.

Devices are the point of contact for human interaction and are responsible for generating the data on which the system relies. Securing these devices can be difficult, as they are vulnerable not just to network-borne threats but to physical tampering as well. This makes it pivotal for developers to address security at design time. Even though there are many security measures developers can put in place, it is important to identify the set of measures that will be most effective.

While designing security into devices, developers come across a variety of challenges. Take, for instance, an embedded device with a small footprint and limited computing resources: heavy security measures would hinder the device’s performance, while too little security might leave loopholes for breaches. This makes it important for developers to identify the security measures that qualify as “just enough.” To identify this “just enough” security, developers rely on three criteria:

1. Where is the device deployed? Is it behind closed doors in a secured location, or out in the open in a public environment?
2. How is the device connected and communicating with the network? Is it on a public or a private network, is it behind a firewall, and is any form of encryption employed?
3. What type of data is the device storing? How sensitive is the data being stored?

Based on the answers to these questions, developers can identify the appropriate security measures to integrate into their devices. It is preferable, however, to have access to an operating system that lets the user choose the most suitable security measures from a set of options.

Although these criteria allow the developer to identify the security measures to put in place, developers also need to ensure that measures are implemented at every stage of the device lifecycle, from the initial design to the operational environment.

Design phase: It is critical to ensure that no malicious code is introduced while development is underway. This is possible through signed binary delivery, which assures the authenticity and non-alteration of code, and by developing on a software platform certified under security standards such as IEC 62443 and ISO/IEC 27034.

Execute phase: The aim is to ensure that the right software is in place on the right hardware and that they trust each other. This root of trust can be established by using secure boot technology and cryptographic key signatures to prevent unsigned code from executing.
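
The cryptographic check at the heart of both signed binary delivery and secure boot is a signature verification against a vendor public key provisioned on the device. Below is a minimal sketch in Python, assuming the third-party cryptography package and hypothetical file paths; a real secure boot chain performs this check in ROM or the bootloader, not in application code.

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

    # Vendor public key provisioned at manufacture time (hypothetical path).
    VENDOR_PUBKEY = open("/etc/device/vendor_ed25519.pub", "rb").read()

    def firmware_is_trusted(image: bytes, signature: bytes) -> bool:
        """Accept the image only if it was signed by the vendor's private key."""
        public_key = Ed25519PublicKey.from_public_bytes(VENDOR_PUBKEY)
        try:
            public_key.verify(signature, image)  # raises InvalidSignature on tampering
            return True
        except InvalidSignature:
            return False

    # Refuse to run unsigned or altered code.
    image = open("/boot/firmware.bin", "rb").read()
    signature = open("/boot/firmware.sig", "rb").read()
    if not firmware_is_trusted(image, signature):
        raise SystemExit("firmware signature check failed; refusing to boot")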

Operate phase: Multiple measures can be deployed to prevent malicious attacks in operation mode, including controls to prevent unauthorized access and encryption to secure network traffic.
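
As one example of securing network traffic in operation, a device can refuse to send telemetry over anything but an authenticated TLS connection. Here is a minimal sketch using Python's standard ssl module; the endpoint, port, and CA file path are hypothetical.

    import socket
    import ssl

    # Trust only the CA that issued the backend's certificate (hypothetical path).
    context = ssl.create_default_context(cafile="/etc/device/backend_ca.pem")
    context.check_hostname = True  # reject certificates issued for other hosts
    context.minimum_version = ssl.TLSVersion.TLSv1_2

    # Hypothetical telemetry endpoint.
    host, port = "telemetry.example.com", 8883
    with socket.create_connection((host, port)) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            tls.sendall(b'{"device_id": "edison-01", "temp_c": 21.5}')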

Power Down phase: When the device is at rest, measures such as encrypted storage and secure data containers should be in place to prevent on-board data access.
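
Encrypted storage at rest can be as simple as sealing each record with an authenticated cipher before it touches flash. The following is an illustrative sketch with AES-GCM from the cryptography package; in a real device the key would live in a secure element or TPM, not alongside the data.

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    # Illustrative only: in production the key comes from a secure element/TPM.
    key = AESGCM.generate_key(bit_length=256)
    aesgcm = AESGCM(key)

    def seal(record: bytes) -> bytes:
        """Encrypt and authenticate a record; prepend the random nonce."""
        nonce = os.urandom(12)  # 96-bit nonce, unique per record
        return nonce + aesgcm.encrypt(nonce, record, None)

    def unseal(blob: bytes) -> bytes:
        """Decrypt a sealed record; raises InvalidTag if it was tampered with."""
        nonce, ciphertext = blob[:12], blob[12:]
        return aesgcm.decrypt(nonce, ciphertext, None)

    sealed = seal(b'{"card_last4": "1234"}')
    assert unseal(sealed) == b'{"card_last4": "1234"}'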

A much-discussed example is a recent security breach suffered by a major retailer in the US that resulted in the theft of millions of customers’ credit and debit card details. The breach took place by compromising the point-of-sale (POS) devices to gain access to the network and capture customer data in real time. A thorough breakdown of the incident showed that the breach was possible because of multiple access-point failures, starting with the HVAC systems not being isolated from the POS systems on the same network, which gave the hackers direct access. Through these POS systems the hackers gained unhindered access to the cash registers, where they reverse-engineered the code to capture customers’ credit and debit card credentials in real time with every purchase. The depth of access these hackers had gained would have kept them invisible had outside investigators not discovered the anomalies and alerted the retailer.

The post Securing Devices in the Internet of Things appeared first on Internet Of Things | IoT India.


The Complete Guide to TensorFlow 1.x


Become an expert in machine learning and deep learning with the new TensorFlow 1.x

About This Book

  • Learn to implement TensorFlow in production
  • Perform highly accurate and efficient numerical computing with TensorFlow
  • Unlock the advanced techniques that bring more accuracy and speed to machine learning activities
  • Explore various possibilities with deep learning and gain amazing insights from data

Who This Book Is For

Are you a data analyst, data scientist, or researcher looking for a guide that will help you increase the speed and efficiency of your machine learning activities? If yes, then this course is for you!

What You Will Learn

  • Learn about machine learning landscapes along with the historical development and progress of deep learning
  • Load, interact, process, and save complex datasets
  • Solve classification and regression problems using state-of-the-art techniques
  • Train machines quickly to learn from data by exploring reinforcement learning techniques
  • Classify images using deep neural network schemes
  • Learn about deep machine intelligence and GPU computing
  • Explore active areas of deep learning research and applications

In Detail

The aim of the course is to help you tackle the common commercial machine learning and deep learning problems that you’re facing in your day-to-day activities.

This Learning Journey begins with an introduction to machine learning and deep learning. You will explore the main features and capabilities of TensorFlow, such as the computation graph, data model, programming model, and TensorBoard. A key highlight is that the course teaches you how to upgrade your code from TensorFlow 0.x to TensorFlow 1.x. Next, you will learn different machine learning techniques such as clustering, linear regression, and logistic regression with the help of real-world projects and examples. You will also learn the concepts of reinforcement learning, the Q-learning algorithm, and the OpenAI Gym framework. Moving ahead, you will dive into neural networks and see how convolutional, recurrent, and deep neural networks work, along with the main operation types used in building them. Next, you will learn advanced concepts such as GPU computing and multimedia programming. Finally, the course demonstrates deep learning on Android using TensorFlow.
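
To give a flavor of the TensorFlow 1.x programming model the course covers, here is a minimal sketch (not taken from the course) of the build-the-graph-then-run-it-in-a-session workflow, fitting a toy linear regression; the data and hyperparameters are illustrative.

    import numpy as np
    import tensorflow as tf  # assumes TensorFlow 1.x

    # Toy data drawn from y = 3x + 2 plus noise.
    x_train = np.linspace(0, 1, 100).astype(np.float32)
    y_train = 3 * x_train + 2 + np.random.normal(0, 0.1, 100).astype(np.float32)

    # Build the computation graph.
    x = tf.placeholder(tf.float32, shape=[None])
    y = tf.placeholder(tf.float32, shape=[None])
    w = tf.Variable(0.0)
    b = tf.Variable(0.0)
    loss = tf.reduce_mean(tf.square(w * x + b - y))
    train_op = tf.train.GradientDescentOptimizer(0.5).minimize(loss)

    # Run the graph in a session.
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for _ in range(200):
            sess.run(train_op, feed_dict={x: x_train, y: y_train})
        print(sess.run([w, b]))  # should approach [3.0, 2.0]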

By the end of this course, you will have a solid knowledge of the all-new TensorFlow and be able to implement it efficiently in production.

Style and approach

This course takes a step-by-step approach to teach you how to implement TensorFlow in production. Starting with the basics of TensorFlow, you will learn machine learning and deep learning techniques, along with the advanced concepts of TensorFlow. With the help of real-world projects and examples, this course will help you apply TensorFlow’s features from scratch.

This course is a blend of text, videos, code examples, and assessments, all packaged up keeping your journey in mind. The curator of this course has combined some of the best that Packt has to offer in one complete package. It includes content from the following Packt products:

  • Building Machine Learning Systems with TensorFlow by Rodolfo Bonnin
  • Deep Learning with TensorFlow by Giancarlo Zaccone, Md. Rezaul Karim, and Ahmed Menshawy