In a typical analytics life cycle, the first step is to develop an analytic using interactive analysis tools such as Zeppelin and Jupyter, and eventually publish it to a catalog for operationalization. Operationalizing involves configuring and running the analytic in a computing environment such as Apache Spark, scaled in the cloud according to the volume of data. The configuration of the analytic includes, for example, the asset instance (a specific piece of equipment running on site) of an asset classification (also known as an asset type). We will discuss the development of industrial analytics using the Spark framework further in later sections of this chapter.
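As an illustrative sketch, a minimal configuration object tying a published analytic to an asset instance of an asset type might look like the following. All names here (`AnalyticConfig`, `asset_instance`, `asset_type`, `spark_executors`) are hypothetical and not part of any specific catalog or Spark API:

```python
from dataclasses import dataclass

# Hypothetical configuration for operationalizing a published analytic.
# The field names are illustrative only, not from a real catalog API.
@dataclass
class AnalyticConfig:
    analytic_name: str    # analytic published to the catalog
    asset_type: str       # asset classification, e.g. a turbine model
    asset_instance: str   # specific equipment running on site
    spark_executors: int  # compute scale, sized to the data volume

config = AnalyticConfig(
    analytic_name="vibration-anomaly-detector",
    asset_type="gas-turbine",
    asset_instance="turbine-07-plant-A",
    spark_executors=8,
)
print(config.asset_instance)
```

In practice, such a configuration would be submitted alongside the analytic so the runtime knows which equipment's data to process and how much compute to allocate.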
In the following diagram, we see the steps of developing ...