Learning Knows No Bounds: Basic Concepts of Big Data

My company held a training session on new technologies, and it mentioned "big data", a concept that, from my perspective, has been hot for several years. So it's time to be a good "pupil" again: when you don't know a word, you look it up in the dictionary; when you don't know a concept, you study it!

---------------------------------------------------------------------------------------------------------------------------------------------------------------------

Big data refers to data sets that cannot be captured, managed, and processed within a reasonable time frame using conventional software tools. It is a high-volume, high-growth, and diverse information asset that requires new processing models to deliver stronger decision-making power, insight discovery, and process-optimization capability.
In Big Data: A Revolution That Will Transform How We Live, Work, and Think, Viktor Mayer-Schönberger and Kenneth Cukier describe big data as analyzing all of the data rather than taking the shortcut of random analysis (sampling surveys). IBM characterizes big data by five Vs: Volume, Velocity, Variety, Value (low value density), and Veracity.

The research firm Gartner gives this definition: "big data" is high-volume, high-growth-rate, and diverse information assets that require new processing models in order to deliver stronger decision-making power, insight discovery, and process-optimization capability.
The McKinsey Global Institute defines it as a collection of data so large that its acquisition, storage, management, and analysis greatly exceed the capabilities of traditional database software tools, with four defining characteristics: massive data scale, rapid data flow, diverse data types, and low value density.
The strategic significance of big data technology lies not in holding vast amounts of data, but in the specialized processing of the meaningful data it contains. In other words, if big data is likened to an industry, the key to making that industry profitable is improving the "processing capability" of data, so that "processing" turns data into "added value".
Technically, big data and cloud computing are as inseparable as the two sides of a coin. Big data cannot be processed on a single computer; it demands a distributed architecture. Its hallmark is distributed data mining over massive data sets, and for that it must rely on cloud computing's distributed processing, distributed databases, cloud storage, and virtualization technologies.
With the arrival of the cloud era, big data has attracted more and more attention. Analyst teams believe that "big data" is usually used to describe the large amounts of unstructured and semi-structured data a company creates, data that would take too much time and money to load into a relational database for analysis. Big data analysis is often tied to cloud computing, because real-time analysis of large data sets requires a framework like MapReduce to distribute work across tens, hundreds, or even thousands of computers.
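To make the MapReduce idea concrete, here is a minimal single-process sketch in Python of the canonical word-count example. The function names and the in-memory shuffle are illustrative assumptions for this sketch, not any framework's API; a real framework such as Hadoop runs the same map and reduce phases in parallel across many machines.

    from collections import defaultdict

    # Map phase: emit a (word, 1) pair for every word in a document.
    def map_words(document):
        for word in document.split():
            yield (word.lower(), 1)

    # Shuffle phase: group values by key. A real framework does this
    # across the network, between the map and reduce phases.
    def shuffle(pairs):
        grouped = defaultdict(list)
        for key, value in pairs:
            grouped[key].append(value)
        return grouped

    # Reduce phase: collapse each key's values into a single result.
    def reduce_counts(word, counts):
        return word, sum(counts)

    docs = ["big data needs new processing models",
            "big data relies on cloud computing"]
    pairs = (pair for doc in docs for pair in map_words(doc))
    for word, counts in shuffle(pairs).items():
        print(reduce_counts(word, counts))

The point of the split is that map and reduce are independent per key, so the framework can assign different documents and different words to different machines without changing the program's logic.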
Big data calls for special techniques to process large volumes of data efficiently within a tolerable elapsed time. Technologies applicable to big data include massively parallel processing (MPP) databases, data mining, distributed file systems, distributed databases, cloud computing platforms, the Internet, and scalable storage systems.

----------------------------------------------------------------------------------------------------------------------------------------------------------------------

Big data is data sets that are so big and complex that traditional data-processing application software is inadequate to deal with them. Big data challenges include capturing data, data storage, data analysis, search, sharing, transfer, visualization, querying, updating, information privacy, and data sourcing. There are a number of concepts associated with big data: originally there were three concepts, volume, variety, and velocity. Other concepts later attributed to big data are veracity (i.e., how much noise is in the data) and value.

Lately, the term "big data" tends to refer to the use of predictive analytics, user behavior analytics, or certain other advanced data analytics methods that extract value from data, and seldom to a particular size of data set. "There is little doubt that the quantities of data now available are indeed large, but that's not the most relevant characteristic of this new data ecosystem." Analysis of data sets can find new correlations to "spot business trends, prevent diseases, combat crime and so on." Scientists, business executives, practitioners of medicine, advertising, and governments alike regularly meet difficulties with large data sets in areas including Internet search, fintech, urban informatics, and business informatics. Scientists encounter limitations in e-Science work, including meteorology, genomics, connectomics, complex physics simulations, biology, and environmental research.

Data sets grow rapidly, in part because they are increasingly gathered by cheap and numerous information-sensing Internet of things devices such as mobile devices, aerial devices (remote sensing), software logs, cameras, microphones, radio-frequency identification (RFID) readers, and wireless sensor networks. The world's technological per-capita capacity to store information has roughly doubled every 40 months since the 1980s; as of 2012, every day 2.5 exabytes (2.5×10^18 bytes) of data are generated. Based on an IDC report prediction, the global data volume will grow exponentially from 4.4 zettabytes to 44 zettabytes between 2013 and 2020. By 2025, IDC predicts there will be 163 zettabytes of data. One question for large enterprises is determining who should own big-data initiatives that affect the entire organization.
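As a quick sanity check on those figures, a few lines of Python show the growth rates they imply; the inputs below are only the IDC projections quoted above, not additional data.

    # Compound annual growth rate implied by IDC's 4.4 ZB (2013) -> 44 ZB (2020) forecast.
    start_zb, end_zb, years = 4.4, 44.0, 7
    cagr = (end_zb / start_zb) ** (1 / years) - 1
    print(f"Implied growth, 2013-2020: {cagr:.1%} per year")   # ~38.9%

    # "Doubling every 40 months" expressed as an annual growth rate.
    annual = 2 ** (12 / 40) - 1
    print(f"Doubling every 40 months: {annual:.1%} per year")  # ~23.1%

In other words, the forecast for data generation (roughly 39% per year) outpaces the historical growth of per-capita storage capacity (roughly 23% per year), which is part of why big data is described as a moving target.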

Relational database management systems and desktop statistics and visualization software packages often have difficulty handling big data. The work may require "massively parallel software running on tens, hundreds, or even thousands of servers". What counts as "big data" varies depending on the capabilities of the users and their tools, and expanding capabilities make big data a moving target. "For some organizations, facing hundreds of gigabytes of data for the first time may trigger a need to reconsider data management options. For others, it may take tens or hundreds of terabytes before data size becomes a significant consideration."