Glossary

Artificial Intelligence

Natural language processing (NLP) concerns the interactions between computers and human language: how to program computers to process and analyse large amounts of natural language data, in the form of written text (unstructured or semi-structured, e.g. quality-control reports or the comments/chat sections of web applications) or human speech (recorded or in real time). A major trend in NLP is building software that emulates a human and responds interactively, in written text or speech.
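As a minimal illustration of processing unstructured text such as comment sections, the sketch below counts recurring keywords using only the Python standard library; the stop-word list and sample comments are invented for the example.

```python
from collections import Counter
import re

def keyword_frequencies(comments, stopwords=frozenset({"the", "a", "is", "of"})):
    """Count how often each word appears across free-text comments,
    ignoring case, punctuation and a small stop-word list."""
    words = []
    for comment in comments:
        words.extend(re.findall(r"[a-z']+", comment.lower()))
    return Counter(w for w in words if w not in stopwords)

# Example: mining quality-control comments for recurring topics.
comments = [
    "The weld seam is cracked near the flange.",
    "Cracked seam again; weld quality is poor.",
]
freq = keyword_frequencies(comments)
print(freq.most_common(3))  # 'weld', 'seam' and 'cracked' each appear twice
```

Real NLP systems go far beyond word counting (parsing, embeddings, dialogue), but the input is the same kind of messy free text.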

Object detection relates to computer vision and image processing and deals with detecting and recognising instances of semantic objects of certain classes (e.g. humans, buildings, cars) in digital images and videos, often based on data fusion from different types of sensors (depth cameras, infra-red, laser, radar, etc.). Spatial recognition deals with locating objects in some spatial reference, such as x, y, z coordinates or a "bounding box".
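The "bounding box" notion above is commonly paired with intersection-over-union (IoU) to measure how well two boxes agree; a minimal sketch, assuming axis-aligned boxes given as (x_min, y_min, x_max, y_max):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned bounding boxes,
    each given as (x_min, y_min, x_max, y_max)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Overlap in each axis (zero if the boxes are disjoint).
    inter_w = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    inter_h = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = inter_w * inter_h
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union else 0.0

# Two overlapping detections of the same car:
print(iou((0, 0, 2, 2), (1, 1, 3, 3)))  # 1/7 ≈ 0.1429
```

Detectors typically use a threshold on IoU (e.g. 0.5) to decide whether two boxes refer to the same object.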

Machine learning (ML) includes algorithms that train/build data models able to learn from a historical dataset, and thus from the experience represented in the data. ML algorithms are used in a wide variety of applications, such as computer vision, process simulation and predictive maintenance. Categories of ML include supervised learning (known historical results/labels exist), unsupervised learning (no labels are available for training, so underlying structures are discovered through pattern matching), and deep learning, which uses artificial neural networks with many layers.
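A toy instance of supervised learning is the 1-nearest-neighbour rule, which predicts a label from the closest labelled historical point; the sensor readings and labels below are invented for illustration:

```python
import math

def nearest_neighbour(train, query):
    """Predict the label of `query` as the label of the closest training
    point: 1-nearest-neighbour, a minimal supervised learner."""
    features, label = min(train, key=lambda pair: math.dist(pair[0], query))
    return label

# Labelled history (invented sensor readings): (features, "ok"/"faulty")
train = [((1.0, 1.0), "ok"), ((1.2, 0.9), "ok"), ((8.0, 9.0), "faulty")]
print(nearest_neighbour(train, (1.1, 1.0)))   # ok
print(nearest_neighbour(train, (7.5, 8.8)))   # faulty
```

The "training" here is just storing the labelled examples; real ML models fit parameters instead, but the supervised setup (labelled history in, prediction out) is the same.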

Expert systems (ESs) emulate the decision-making or diagnostic ability of human experts. They are designed to solve complex problems by reasoning through bodies of knowledge, represented mainly as if-then-else rules rather than conventional computer code. Human expert rules can be combined with data-driven models and rules extracted directly from data. An ES is divided into two subsystems: the knowledge base, representing facts and rules, and the inference engine, which applies the rules to deduce new facts.
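The knowledge-base/inference-engine split can be sketched as forward chaining over if-then rules; the diagnostic rules below are hypothetical:

```python
def forward_chain(facts, rules):
    """Minimal inference engine: repeatedly apply if-then rules to the
    fact set until no new fact can be deduced."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            # Fire the rule if all its conditions hold and it adds something new.
            if conclusion not in facts and set(conditions) <= facts:
                facts.add(conclusion)
                changed = True
    return facts

# Tiny hypothetical diagnostic knowledge base:
rules = [
    (("motor hot", "vibration high"), "bearing worn"),
    (("bearing worn",), "schedule maintenance"),
]
print(forward_chain({"motor hot", "vibration high"}, rules))
```

Note how "schedule maintenance" is deduced in two steps, via the intermediate fact "bearing worn": that chaining is what the inference engine contributes beyond a flat lookup.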

Case-based reasoning (CBR) solves new problems based on the solutions of similar past ones; it is an example of solving by analogy, as humans commonly do. Implemented as a computer application, CBR has four key steps: Retrieve (obtain similar previous cases), Reuse (adapt a previous case to the new situation), Revise (test and further adapt as necessary) and Retain (store the adapted case for future use if it proved effective).
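The Retrieve/Reuse/Revise/Retain cycle might be sketched as follows; the similarity measure, adaptation rule and example case base are all simplifications invented for illustration:

```python
def solve_with_cbr(case_base, problem, adapt, works):
    """One pass of the CBR cycle over (problem, solution) cases:
    Retrieve the most similar past case, Reuse (adapt) its solution,
    Revise by testing it, and Retain the new case if it proved effective."""
    # Retrieve: closest past problem by absolute difference (toy similarity).
    past_problem, past_solution = min(case_base, key=lambda c: abs(c[0] - problem))
    # Reuse: adapt the old solution to the new problem.
    candidate = adapt(past_solution, past_problem, problem)
    # Revise: test the candidate; Retain: store it only if it worked.
    if works(problem, candidate):
        case_base.append((problem, candidate))
    return candidate

# Hypothetical example: scale a known machine setting to a new batch size.
case_base = [(10, 5.0)]                        # (batch size, setting) pairs
adapt = lambda sol, old, new: sol * new / old  # proportional adaptation
solution = solve_with_cbr(case_base, 12, adapt, lambda p, s: True)
print(solution, case_base)  # 6.0, and the new case is retained
```

The growing case base is what lets a CBR system improve with experience.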

Intelligent agents are processes (typically asynchronous) that interact among themselves and with the environment in an intelligent manner to achieve some goal. Individually, they can be very simple or very complex, depending on the application. Multi-agent systems (i.e. hundreds of agents or more) can be used to create cyber-physical systems (CPSs), in which a mechanism is controlled or monitored by computer-based algorithms. A CPS is typically designed as a network of elements that interact via physical inputs and outputs, related to the fields of robotics and sensor networks.
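A much-simplified sketch of a cyber-physical loop with three cooperating agents (sensor, controller, actuator); a real CPS would run these asynchronously over a network, and all the numbers below are invented:

```python
def run_cps(target=21.0, steps=6):
    """Toy cyber-physical loop: a sensor agent reads the environment,
    a controller agent decides, and an actuator agent changes the
    physical state. The agents run synchronously here for simplicity."""
    temperature = 15.0
    history = []
    for _ in range(steps):
        reading = temperature                    # sensor agent observes
        heating = reading < target               # controller agent decides
        temperature += 1.5 if heating else -0.5  # actuator acts; physics responds
        history.append(round(temperature, 2))
    return history

print(run_cps())  # temperature climbs toward the 21.0 target, then hovers
```

Even this toy loop shows the defining CPS trait: computation and physical state influencing each other through inputs and outputs.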

Big Data

Data visualization is an interdisciplinary field whose object is the representation of data in graphical form. As a form of communication, it is particularly effective when the amount of data to be represented is large, e.g. with time series and/or Big Data. Key applications include dashboards for the real-time control of complex processes, and management decision-support systems.
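As a tiny example of representing a time series compactly, a text "sparkline" scales each sample onto a row of block glyphs; the sample values are invented:

```python
BARS = "▁▂▃▄▅▆▇█"

def sparkline(values):
    """Render a time series as a compact text sparkline: one glyph per
    sample, scaled to the series' own min/max range."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1  # avoid division by zero for a flat series
    return "".join(BARS[round((v - lo) / span * (len(BARS) - 1))] for v in values)

print(sparkline([1, 2, 4, 8, 4, 2, 1]))  # ▁▂▄█▄▂▁
```

Dashboard tools apply the same idea (mapping values to visual extent) with far richer chart types.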

Data processing deals with the collection and manipulation of items of data to produce meaningful information. It may involve various processes: validation (ensuring that supplied data is correct and relevant), sorting (arranging items in sequence and/or into different sets), summarization (reducing detailed data to its main points), aggregation (combining multiple pieces of data), analysis/interpretation, reporting, and classification (separating data into categories).
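Several of the listed processes can be chained into a pipeline; a toy sketch over invented (item, quantity) records:

```python
def process(records):
    """Toy data-processing pipeline over (item, quantity) records:
    validation -> sorting -> aggregation -> summarization."""
    # Validation: keep only records with a positive numeric quantity.
    valid = [(item, qty) for item, qty in records
             if isinstance(qty, (int, float)) and qty > 0]
    # Sorting: arrange items in sequence by name.
    valid.sort(key=lambda r: r[0])
    # Aggregation: combine quantities per item.
    totals = {}
    for item, qty in valid:
        totals[item] = totals.get(item, 0) + qty
    # Summarization: reduce the detail to headline figures.
    return {"items": totals, "grand_total": sum(totals.values())}

report = process([("bolt", 4), ("nut", -1), ("bolt", 6), ("washer", "n/a")])
print(report)  # {'items': {'bolt': 10}, 'grand_total': 10}
```

The invalid records ("nut" with a negative quantity, "washer" with a non-numeric one) are dropped at the validation stage before aggregation.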

Data privacy (also known as information privacy) concerns the relationship between the collection and dissemination of data, technology, the public expectation of privacy, and the legal and political issues surrounding them. In industrial terms it relates to cyber-security and defences against cyber-attacks, and to protecting industrial secrets, patents and confidentiality.

Data management comprises all the disciplines related to managing data as a valuable resource. It offers tools to facilitate the management of data and improve performance: an integrated, modular environment to manage enterprise application data and optimize data-driven applications over their lifetime. Its objectives include producing enterprise-ready applications faster; improving data access and speeding up iterative testing; automating and simplifying operations; and supporting business growth.

IT infrastructure relates to the hardware and services on which AI/BD solutions are built. It includes: file and disk storage services such as file servers, file backup, long-term archives and FTP; networks; authentication and service-access authorisation; managed platforms for the virtual hosting of Windows, Linux and Unix services on reliable virtualisation platforms; and cloud computing services providing platforms for self-provisioned server infrastructure to support both employees and clients.

Creating awareness of technology, market and business trends so as to adapt products/services to future demand and customer needs, by obtaining strategic information and data from internal and external sources. Open innovation taps into knowledge and assets available within and beyond a single company, along with any relevant data. Sub-processes: Sales, Customer Relationship Management, Consumer Behaviour Analysis, Market Scenario Analysis, Demand Management and Forecasting.

Basic activity of conceptualizing, creating and evolving product features to solve customers'/users' problems or address specific needs in a given market. Product customization, closely related to design, aligns a product with a particular customer's desires, increasing the customer's perceived value of the product. Sub-processes: Product/Service Design and Customization, New Product/Service Introduction, Design.

Series of actions and techniques applied to detect possible failures and defects of machinery at an early stage, preventing major failures and future stoppages. The objective is to maintain a certain level of service in the given process industry, which requires capturing large amounts of data from machine sensors together with information from periodic reports and planned maintenance.
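A minimal early-warning rule of this kind might compare each sensor reading with a rolling baseline; the window size, threshold factor and vibration data below are invented for illustration:

```python
def flag_anomalies(readings, window=3, factor=1.5):
    """Flag the indices of readings that exceed `factor` times the mean
    of the preceding `window` readings: a toy early-warning rule."""
    flagged = []
    for i in range(window, len(readings)):
        baseline = sum(readings[i - window:i]) / window
        if readings[i] > factor * baseline:
            flagged.append(i)
    return flagged

# Invented vibration readings with one sudden spike:
vibration = [1.0, 1.1, 0.9, 1.0, 1.05, 2.4, 1.0]
print(flag_anomalies(vibration))  # [5] — the 2.4 spike
```

Production systems combine many such signals (and learned models) rather than a single threshold, but the principle of comparing live data against expected behaviour is the same.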

Planning, executing and controlling the operations of the supply network so as to meet customer needs effectively. Complex requirements, deadlines and restrictions often conflict or overlap; data models and intelligent planning can therefore help find optimal configurations that balance differently prioritized requirements and commitments. Sub-processes: Procurement, Production, Storage, Distribution, Network Design, Logistics Systems, Supplier Relationship Management, Contract Management, Sourcing, Scheduling.
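As a toy instance of balancing cost against capacity constraints, the sketch below builds a greedy sourcing plan; the supplier names, capacities and costs are invented, and real planners handle many more requirements and priorities:

```python
def plan_sourcing(demand, suppliers):
    """Greedy sourcing plan: fill `demand` from the cheapest suppliers
    first, respecting each supplier's capacity.
    `suppliers` is a list of (name, capacity, unit_cost) tuples."""
    plan, remaining = {}, demand
    for name, capacity, unit_cost in sorted(suppliers, key=lambda s: s[2]):
        if remaining <= 0:
            break
        take = min(capacity, remaining)  # never exceed a supplier's capacity
        plan[name] = take
        remaining -= take
    return plan

suppliers = [("A", 60, 2.0), ("B", 100, 1.5), ("C", 50, 3.0)]
print(plan_sourcing(120, suppliers))  # {'B': 100, 'A': 20}
```

A greedy rule is optimal for this single-criterion toy; once deadlines, lot sizes and priorities conflict, planners turn to mathematical optimization instead.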

Discipline of adjusting a process to maintain or optimize a specified set of parameters without violating process constraints. Digital twins are one solution for modelling and simulating complex processes, avoiding, for example, expensive trial-and-error calibration. Sub-processes: Process and Equipment Monitoring, Quality Control and Monitoring, Process Redesign.

Functional process including all the activities that guarantee the effective allocation of resources (human, physical, financial) for the introduction of new products/services or the improvement of existing ones. Many digital technologies play a role here, e.g. data management, intelligent planning, data visualization, cyber-physical systems, data understanding and characterization, natural language processing, etc. Sub-processes: Scenario-Based Analysis, Optimization/Simulation, HR Management, Risk Management, Collaborative/Joint Innovation Platform Development, Process Redesign, Development, Testing, and Piloting.