
Complexity Problems Handled by Big Data Technology / Lv, Zhihan*; Ota, Kaoru; Lloret, Jaime; Xiang, Wei; Bellavista, Paolo. - In: COMPLEXITY. - ISSN 1076-2787. - ELECTRONIC. - 2019 (2019), pp. 9090528.1-9090528.7. [10.1155/2019/9090528]

Complexity Problems Handled by Big Data Technology

Bellavista, Paolo
2019

Abstract

Big data demands new processing modes that offer stronger decision-making power, insight discovery, and process-optimization capability for high-volume, fast-growing, and diversified information assets. As a new generation of information technology built on the Internet of Things, cloud computing, and the mobile Internet, big data makes it possible to record and collect all the data produced across the whole life cycle of a thing's existence and evolution. It starts from completely expressing a thing and a system, in order to capture the coupling relationships between things. When the panoramic, whole-life-cycle data are large enough, and the system's component structure together with the static and dynamic data of each individual are recorded, big data can integrally depict a complicated system and its emergent phenomena. Viktor Mayer-Schönberger proposed three shifts of thought for the big data era: not random samples but the whole data; not accuracy but complexity; and not causality but correlation. "The whole data" refers to the shift from local to holistic thinking, taking all data (big data) as the object of analysis. "Complexity" means accepting the complexity and inaccuracy of data. The shift from causality to correlation places the emphasis on correlation, so that the data themselves reveal the rules. This is closely related to understanding things through complex-scientific thinking, that is, holistic thinking, relational thinking, and dynamic thinking. Big data analysis technology is the key to exploring the value hidden in big data. The traditional scientific analysis method records samples of a thing's states, which is a small-data method, and perceives things based on small sample data, mathematical induction, and logical induction. Such a method, however, cannot effectively solve complexity problems.
In the big data era, the quantitative description of a complex, huge system is no longer mere experimental sample data but full-scene data of the overall state. Therefore, data analysis should adopt complex-scientific intelligent analysis methods for modeling and simulation, utilize and constantly optimize big data for machine learning, and study the self-organizing and evolutionary rules of complex systems.
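As a minimal illustration of the shift from "random samples" to "the whole data" described above, the following Python sketch contrasts sample-based inference with a whole-population computation. The data are synthetic (a hypothetical population of sensor readings); the specific distribution and sizes are assumptions for demonstration only, not part of the original work.

```python
import random
import statistics

# Synthetic "whole data": a hypothetical population of 100,000 sensor readings.
random.seed(42)
population = [random.gauss(50.0, 15.0) for _ in range(100_000)]

# Small-data approach: estimate the mean from a random sample of 100 readings.
sample = random.sample(population, 100)
sample_mean = statistics.mean(sample)

# Big-data approach: compute the mean directly over the whole population.
true_mean = statistics.mean(population)

print(f"sample mean (n=100): {sample_mean:.2f}")
print(f"whole-data mean:     {true_mean:.2f}")
```

The sample estimate carries sampling error that the whole-data computation avoids; with all data available, statistics can be computed exactly rather than inferred, which is the essence of the "whole data" shift.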
Lv, Zhihan*; Ota, Kaoru; Lloret, Jaime; Xiang, Wei; Bellavista, Paolo
Files in this product:
File: 9090528.pdf
Access: open access
Type: publisher's version (PDF)
License: Open Access license, Creative Commons Attribution (CC BY)
Size: 1.62 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11585/665046
Citations
  • PMC: ND
  • Scopus: 3
  • Web of Science: 2