The Digital Centre's Work on Big Data
Although it is a rather nebulous term, "Big Data" typically means one of two things: big data in a statistical sense, meaning "Big N" and implying the use of all N available data points rather than just a subsample; or "big" as in massive data size, implying storage requirements that stretch beyond normally available capacity. The latter meaning is obviously time dependent, as storage is becoming steadily more affordable and plentiful. We have interests in both meanings as part of our Big Data research work within The Digital Centre.
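To make the "Big N" sense concrete, the following illustrative Python sketch (invented for this page, not drawn from our software) contrasts a statistic computed over all N points with the same statistic estimated from a small subsample:

```python
import random
import statistics

random.seed(42)

# Synthetic "population" of N data points standing in for a full data set.
N = 1_000_000
population = [random.gauss(50.0, 10.0) for _ in range(N)]

# "Big N" approach: compute over every one of the N points.
full_mean = statistics.fmean(population)

# Traditional approach: estimate the same quantity from a subsample.
subsample = random.sample(population, k=1_000)
sample_mean = statistics.fmean(subsample)

print(f"full-N mean:    {full_mean:.3f}")
print(f"subsample mean: {sample_mean:.3f}")
```

The full-N computation removes the sampling error of the estimate, at the cost of touching every data point, which is precisely where scalable storage and processing become the constraint.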
We are developing software embodying algorithms for processing very large data sets in a Big N statistical sense using: graph algorithms drawn from our work on network science; deep learning and artificial intelligence techniques to identify and mine data patterns; and immersive augmented and virtual reality techniques for visualising complex data sets.
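As a hypothetical illustration of the graph-algorithm strand (the edge data and function here are invented for the example, not part of our software), a node-degree count over a large edge list can be accumulated in a single streaming pass, so the edges themselves need never be held in memory at once:

```python
from collections import Counter

def degree_counts(edges):
    """Accumulate node degrees in one pass over an edge stream.

    Only the per-node counters are kept in memory, so the edge
    stream can be read lazily from disk for Big-N data sets.
    """
    degrees = Counter()
    for u, v in edges:
        degrees[u] += 1
        degrees[v] += 1
    return degrees

# Toy edge list standing in for a network-science data set.
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]
degrees = degree_counts(edges)
print(dict(degrees))  # node 2 touches three edges, so its degree is 3
```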
Along with colleagues and collaborators within the University of Hull and in local industry, we are considering how a state-of-the-art massive-capacity data centre might look and function. The University operates a state-of-the-art supercomputer, known locally as "Viper", with 5,500 compute cores at the time of writing. We are presently considering what a separate "big data" supercomputer, focussed more heavily on storage and on big data project work, might look like, and how it might be housed in a new custom-designed data centre building so as to best address: data protection and cybersecurity; processing scalability; storage management; analytic processing; and visual analytic capabilities.
Some of our Big Data projects include:
There are opportunities to work with us in the area of Big Data through: contract research and development; collaborative grant-funded work; student work placements; student research internships; or postgraduate study.
You can contact us via Prof Ken Hawick, Director.