Big Data & Data Management
Prof. Ziegler and his team’s area of expertise is psychological diagnostics. He has successfully collaborated with renowned companies on projects focusing on the development of competence models, employee surveys, annual performance reviews, and occupational aptitude testing. Prof. Ziegler and his team also train individuals and teams in topics of personnel assessment, e.g. assessment centres and interviews. Moreover, Prof. Ziegler develops and evaluates customised diagnostic processes.
PC-laboratory with 10 workstations as well as several performance and personality tests.
- Accor Hotels Germany: development of competence models
- SHS VIVEON: realisation of training for interviewers
- International executive headhunting company: development of customised personality test
Prof. Winter’s research is focused on animal behaviour, especially cognitive performance, e.g. learning ability and memory and their neurobiological foundations in the context of foraging. He carries out behavioural assessments of mice and rats, mostly for preclinical research, to investigate decision-making processes in animals. Lab space is available, inter alia, at the Berlin Mouse Clinic for Neurology and Psychiatry (BMC), one of the leading diagnostic institutions within the NeuroCure Cluster of Excellence. Prof. Winter and his team also advise on experimental design and develop automated equipment and systems for behavioural studies on rodents for external laboratories. Another area of his expertise is virtual reality for humans. Here, Prof. Winter focuses on the development of a compact, mobile, and immersive virtual reality environment for a wide range of uses.
- Computer controlled sensor and actuator systems for manipulation and measuring of behaviour
- Modular software system for automated measuring of behaviour
- RFID technology
- Immersive 3D-virtual reality system with closed monitoring (octagon)
- Software for rapid implementation of specific VR-applications and interaction scenarios
- Planning and development of a system for the automated diagnosis of motor-skill parameters after clinical surgery, for a large international pharmaceutical group: 24-hour recording and measurement as an operator-independent system
Prof. Weidlich is a professor at the Department of Computer Science, where he leads the Process-Driven Architectures research group. The goal of his team is to support and improve the design and analysis of process-oriented information systems (POIS) and event-driven systems (ES). POIS are software systems that help to automate, monitor, and control processes. These systems are established in various domains, from logistics through healthcare to infrastructure monitoring. The team’s research focuses on formal methods for behavioural modelling and verification, event-driven approaches to monitoring and controlling processes, as well as questions related to process-driven data integration. The group further works on techniques that optimise the run-time behaviour of event-driven systems. In particular, the evaluation of event queries over streams can be carried out more efficiently once regularities within the event streams have been detected. In 2016, Prof. Weidlich was named a Junior Fellow of the German Informatics Society (Gesellschaft für Informatik, GI) and awarded the Berlin Research Prize (Young Scientist) by the Governing Mayor of Berlin.
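The kind of event query evaluated over a stream can be sketched as follows. This is a minimal illustration of a hypothetical "A followed by B within a time window" pattern; the event types, timestamps, and window size are invented for the example and are not taken from the group's systems:

```python
from collections import deque

def match_sequence(stream, first, second, window):
    """Report (first, second) timestamp pairs where a `second`-type event
    follows a `first`-type event within `window` time units."""
    pending = deque()   # timestamps of recent `first`-type events
    matches = []
    for timestamp, event_type in stream:
        # drop `first` events that have fallen out of the time window
        while pending and timestamp - pending[0] > window:
            pending.popleft()
        if event_type == first:
            pending.append(timestamp)
        elif event_type == second:
            matches.extend((t, timestamp) for t in pending)
    return matches

events = [(1, "order"), (3, "ship"), (9, "order"), (20, "ship")]
matches = match_sequence(events, "order", "ship", window=5)  # [(1, 3)]
```

Knowing regularities of the stream (e.g. that "ship" never occurs more than a few time units after "order") allows an engine to discard pending partial matches earlier, which is the kind of run-time optimisation described above.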
- extensive experience with the documentation of business processes and in training for process management techniques
- expertise in process mining, data-driven analysis of processes in terms of qualitative (compliance requirements) and quantitative (bottleneck-analysis, management of resource deployment) properties
- know-how in scalable infrastructures for event stream processing
- for a leading US cancer clinic: analysis and improvement of clinical processes based on the data of a real-time-locating-system
- for an international oil and gas group: development of techniques for detecting irregularities in streams of sensor data
- for a well-known German manufacturer of enterprise software: design and development of add-ons for a business process modelling platform
With his Computer Engineering research group, Prof. Scheuermann develops technologies for application-specific communication and computer systems that are efficient, secure, and reliable. The topics range from tailored communication protocols and application-specific digital circuits all the way to different aspects of online anonymity and privacy-preserving technologies. For instance, Prof. Scheuermann aims to interconnect automobiles to make the best possible use of the available road network resources. Furthermore, he develops specialised circuitry for firewalls, so that within nanoseconds the firewall can decide which communication should be permitted. The research team models flow control and congestion control mechanisms for Internet anonymity systems, which, for example, allow political activists in totalitarian regimes to bypass Internet censorship. They design privacy-compliant data collection algorithms, which yield detailed Internet traffic statistics without revealing information about individual users. His team’s expertise encompasses the set-up of wireless communication between manufacturing machines in factories, so as to optimise the production process. In the field of warehouse logistics, Prof. Scheuermann combines measurements from various types of sensors to allow for a more accurate positioning of goods. In all fields of operation, tailored solutions for communication protocols and circuitry are necessary. The challenges are rooted in the particular application area: for instance, very fast or particularly reliable communication may be needed; data exchange may be limited by inherent technical constraints while the application must still function reliably; IT security requirements may need to be taken into account in new application fields where standard solutions fail; or there may be a trade-off between data protection requirements or user privacy concerns and the communication requirements for data transmission.
In all mentioned examples, it is necessary to look beyond common solution strategies and to keep the whole picture in mind. This systems perspective characterises the Computer Engineering group and its projects.
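One classic mechanism behind privacy-compliant data collection of the kind described above is randomised response: each user perturbs their answer locally, so no individual report is conclusive, yet the aggregate statistic can still be recovered. The sketch below illustrates the general technique, not necessarily the group's own algorithm; the survey question and probability are illustrative:

```python
import random

def randomized_response(truth: bool, p_truth: float = 0.75) -> bool:
    """Report the true answer with probability p_truth, otherwise a coin
    flip — so no single report reveals the user's actual answer."""
    if random.random() < p_truth:
        return truth
    return random.random() < 0.5

def estimate_true_rate(reports, p_truth=0.75):
    """Invert the perturbation: E[observed rate] = p*true + (1-p)*0.5."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p_truth) * 0.5) / p_truth
```

With enough reports, `estimate_true_rate` converges to the true fraction of users for whom the answer is yes, while any individual report stays plausibly deniable.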
- extensive experience in analytical, simulative, and experimental analysis of network protocols and digital circuits
- well-equipped network laboratory, which allows for realistic set-ups of scenarios and network topologies (for experimental assessments of wired and wireless communication protocols)
- workshops and laboratories for the design and evaluation of application-specific digital circuits, in particular of FPGA-based systems
- for a large German car manufacturer: the group developed application-specific communication protocols for the data exchange between automobiles, including analytical and simulative assessment
- for a financial service provider: the group assessed the security of its IT and communication infrastructure
- together with an IT security solution provider: the group developed tailored processors for hardware-supported firewalls
- with a young start-up company: the group is developing a secure system and communication architecture for highly distributed Smart City applications
The research of Prof. Leser and his group is focused on all aspects of management, integration and analysis of heterogeneous, large and distributed data sets including natural language texts (text-mining and information extraction). This encompasses subjects such as data warehouses and ETL, graph databases, deep web and semantic web, machine learning, similarity search, scientific workflows, statistical methods of data analysis as well as methods for assessing and securing data quality. The team of Professor Leser conducts research in a variety of interdisciplinary projects, especially with colleagues from the life sciences covering the range from basic molecular biology to Systems Medicine.
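Similarity search, one of the subjects listed above, can be sketched with character n-gram overlap, a generic technique for spotting near-duplicate records in data-quality work. The record names below are invented for the example:

```python
def ngrams(s, n=3):
    s = f" {s.lower()} "   # pad so word boundaries form their own n-grams
    return {s[i:i + n] for i in range(len(s) - n + 1)}

def jaccard(a, b, n=3):
    """Jaccard similarity of the two strings' n-gram sets."""
    ga, gb = ngrams(a, n), ngrams(b, n)
    return len(ga & gb) / len(ga | gb)

# group records whose names are near-duplicates
names = ["Humboldt-Universitaet", "Humboldt Universitaet", "TU Berlin"]
pairs = [(a, b) for i, a in enumerate(names)
         for b in names[i + 1:] if jaccard(a, b) > 0.6]
```

Set-based n-gram similarity is cheap to index (e.g. via inverted lists per n-gram), which is what makes similarity search feasible on large, heterogeneous data sets.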
Very good IT-facilities:
- several state-of-the-art parallel computer clusters (20-80 CPUs, 1 TB main memory)
- cluster with 60+ cores
- 50TB+ storage
- For an IT service provider: Consultancy and prototype development in the field of master data standardisation and integration
- For an IT-manufacturer: Development and evaluation of algorithms for analysing data quality
- For an international pharmaceutical company: Consultancy and development of text-mining-procedures in biomarker development
- For a medium-sized biotech company: Joint system development (partly funded by the German Federal Ministry for Economic Affairs and Energy) in the field of evaluating changes in human genotypes
Prof. Kuemmerle’s team strives to better understand how land use change affects the environment and society. This includes assessments of spatial patterns and the underlying drivers of, for example, tropical deforestation, agricultural abandonment, or the intensification of farming. Furthermore, the team studies how changes in land use affect ecosystem services (e.g. food production or carbon storage) and biodiversity. One focus of the group is the analysis of trade-offs between resource use and nature conservation, as well as how such trade-offs could be mitigated. The group also seeks to generate insights and records of high practical value for land use and nature conservation planning, and to assess the effectiveness of nature conservation measures. Prof. Kuemmerle’s regional focus is on Europe, the former Soviet Union, and Latin America.
- the working group offers quantitative, spatially-explicit methods drawing from spatial ecology, geoinformatics, remote sensing, spatial statistics, econometrics, conservation planning, and wildlife biology
- equipment to carry out broad-scale surveys to gather primary data
- server and data storage infrastructure that allows for computationally demanding and/or big data projects
- different projects with environmental organisations
- alta4 Geoinformatics AG
Prof. Hostert explores cutting-edge satellite data analysis. His main focus lies on questions of global change, particularly large-scale mapping of agricultural, forestry, and near-natural ecological systems worldwide. He analyses changes of the earth’s surface with a range of methods, for example machine learning, big data, time series analyses, hyperspectral and multisensor approaches, as well as multiscale analyses. The team’s regional expertise covers Germany, the Mediterranean, and South America, as well as Central Asia.
- satellite data analysis
- AI in remote sensing
- large-scale remote sensing analysis with big data approaches (particularly Sentinel-2, Landsat), funded through projects of the BMWi, BMBF, BMEL, as well as the EU
- scientific monitoring of satellite missions (Landsat Science Team, EnMAP Scientific Advisory Group)
- satellite-based mapping and land use analysis for NGOs, authorities, and global tech and logistics companies
Professor Hafner’s research in Adaptive Systems is concerned with extracting principles of intelligence from biological systems and transferring them to artificial systems. We focus on the transfer of cognitive skills to autonomous robots. The challenge not only lies in building intelligent autonomous robots, but also in gaining insights into biological systems through robot experiments. Our main research themes are sensorimotor learning, internal models for prediction, attentional processes, and spatial cognition. The methodological approaches cover evolutionary algorithms, neural learning, and information theory. We use various types of mobile robots as research platforms, e.g. humanoid, mobile, flying, and underwater robots, as well as software simulations. Professor Hafner is an IEEE Senior Member and a Principal Investigator in several projects funded by the EU.
- Local company for automation and robotics: Student semester project for the development of a collaborative fleet management system for autonomous transport robots.
The Logic in Computer Science group focuses on theoretical computer science, with emphasis on logic, database theory, and complexity theory. Particular attention is paid to the relations between these areas. For example, logics serve as a basis for database query languages and specification languages used for automatic verification, and many aspects of complex systems can be modeled by logical structures. Thus, properties of complex systems can be specified by logical formulas. The overall aim of our group is to gain better understanding of the complexity inherent in a problem or a system. Here, we are interested in various measures of complexity, including the computational complexity ("How difficult is it to algorithmically solve the problem?") and the descriptive complexity ("How difficult is it to describe the problem in a suitable formalism?"). Particular attention is paid to the connection between logical description of problems and efficient algorithmic solutions.
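The idea that properties of a system can be specified by logical formulas and then checked can be illustrated on a finite structure. This is a toy example, not taken from the group's work: a directed graph serves as the logical structure, and two first-order sentences are evaluated by expanding their quantifiers over the universe.

```python
# A directed graph viewed as a finite logical structure:
# a universe of elements plus a binary edge relation E.
universe = {1, 2, 3}
edges = {(1, 2), (2, 3), (3, 1)}

# The first-order sentence  ∀x ∃y E(x, y)  ("every state has a
# successor"), evaluated by quantifier expansion over the universe.
every_state_has_successor = all(
    any((x, y) in edges for y in universe) for x in universe
)

# ∃x ∀y ¬E(y, x)  ("some state has no incoming edge") — false here,
# since the three edges form a cycle.
some_state_unreachable = any(
    all((y, x) not in edges for y in universe) for x in universe
)
```

The naive expansion takes time exponential in the number of nested quantifiers; how the logical shape of a specification governs the cost of checking it is exactly the concern of descriptive complexity.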
Prof. Haerdle’s main research interests are quantitative finance, esp. multivariate methods in banking and finance, dimension reduction techniques, and computational statistics. In his roles both as coordinator of the Collaborative Research Center “Economic Risk” (CRC 649) and director of the interdisciplinary Center for Applied Statistics and Economics (C.A.S.E.), he primarily investigates economic risks on a global scale. Prof. Haerdle’s research aims to facilitate the evaluation of such risks and to reduce uncertainty in order to improve economic actors’ decision-making.
Prof. Haerdle is Distinguished Visiting Professor at the Wang Yanan Institute for Studies in Economics (WISE) at Xiamen University, China, as well as director of the International Research Training Group “High Dimensional Non Stationary Time Series” (IRTG 1792). Among other distinctions, he received the “Econometric Theory Multa Scripsit Award” in 2012.
- multivariate statistical analysis (factor analysis, cluster analysis, etc.)
- portfolio optimisation
- risk management
- pricing derivatives
- functional data analysis
- non- and semi-parametric methods
- data visualisation
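Portfolio optimisation, one of the services listed above, has a simple worked instance in the closed-form two-asset minimum-variance portfolio. The variances and covariance below are made up for illustration:

```python
def min_variance_weights(var1, var2, cov):
    """Closed-form weights of the two-asset minimum-variance portfolio:
    w1 = (var2 - cov) / (var1 + var2 - 2*cov), w2 = 1 - w1."""
    w1 = (var2 - cov) / (var1 + var2 - 2 * cov)
    return w1, 1 - w1

def portfolio_variance(w1, var1, var2, cov):
    """Variance of a fully invested two-asset portfolio with weight w1."""
    w2 = 1 - w1
    return w1 ** 2 * var1 + w2 ** 2 * var2 + 2 * w1 * w2 * cov

# illustrative inputs: volatilities of 20% and 30%, covariance 0.012
w1, w2 = min_variance_weights(0.04, 0.09, 0.012)
```

With many assets the same idea generalises to a quadratic programme over the covariance matrix, which is where the multivariate and dimension-reduction methods above come into play.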
- Ongoing cooperation with and lecturing for leading international financial institutions
- Center for Applied Statistics and Economics (C.A.S.E.): interdisciplinary research centre with the goal of analysing and solving current complex economic problems, and those arising in related fields, with the help of quantitative methods and computing. Its research subjects range from weather risk and aging societies to crime and property markets
- Collaborative Research Center “Economic Risk” (CRC 649): centre of transdisciplinary research where insights from economics, mathematics, and statistics converge to analyse economic risks and risk factors. The CRC offers an international platform for discussing the latest research results and for collaboration
At the Chair of Software Engineering, Prof. Grunske and his team specialise in software engineering methods for the automated development and quality control of software systems. His work also involves probabilistic techniques with which the probable and less probable behaviour of a program can be modelled, allowing software anomalies to be discovered and corrected more easily. Such statistical models are used in monitoring and debugging programs at runtime as well as in software testing, which supports the development of safe and reliable software systems. Furthermore, Prof. Grunske develops methods that enable a precise definition of the quality requirements of software systems, the formalisation of verification conditions, the (technical) safety of embedded systems, and process and performance management.
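One way a probabilistic behaviour model of this kind can be sketched is as a first-order Markov model learned from execution traces, where transitions the model considers improbable are flagged as potential anomalies. The event names, traces, and threshold below are illustrative, not taken from the chair's tooling:

```python
from collections import defaultdict

def learn_transition_model(traces):
    """Estimate transition probabilities between program events from
    observed execution traces (a first-order Markov model)."""
    counts = defaultdict(lambda: defaultdict(int))
    for trace in traces:
        for a, b in zip(trace, trace[1:]):
            counts[a][b] += 1
    return {a: {b: n / sum(nxt.values()) for b, n in nxt.items()}
            for a, nxt in counts.items()}

def anomalous_steps(trace, model, threshold=0.1):
    """Flag transitions the learned model considers improbable."""
    return [(a, b) for a, b in zip(trace, trace[1:])
            if model.get(a, {}).get(b, 0.0) < threshold]

traces = [["open", "read", "close"]] * 10 + [["open", "close"]]
model = learn_transition_model(traces)
```

Here `anomalous_steps(["open", "close"], model)` flags the rare direct `open → close` transition, since it occurred in only one of eleven training traces.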
- Software engineering
- Testing and verification
- Statistics/probabilistic methods
- Formalisation of application scenarios in cooperation with TWT GmbH: “Safe.Spec: Quality control of behaviour requirements”
- Deriving probabilistic models from software systems that can be used as specifications during the software engineering process: “EMPRESS: Extracting and Mining of Probabilistic Event Structures from Software Systems”
- Development of evaluation methods for probabilistic models as well as machine learning based techniques for the transformation of models: “ENSURE-II: ENsurance of Software evolution by Run-time cErtification”
Prof. Freytag holds the chair of Databases and Information Systems (DBIS). His research interests include all aspects of processing and query optimisation in (object-)relational database systems, developments related to databases (such as semi-structured or graph-based data), data quality, big data analyses, as well as privacy support in database and information systems. Furthermore, Prof. Freytag is involved in many collaborations using database technology for applications such as geoinformation systems (GIS), bioinformatics, physics, and the life sciences. In the past, he received the IBM Faculty Award four times for collaborative work concerning databases, middleware, and bioinformatics/life sciences. In 2009 and 2010, Prof. Freytag won the HP Labs Innovation Research Award for his research in the field of databases and cloud computing. He was one of the organisers of the VLDB (Very Large Data Bases) conference in Berlin in 2003, the most important international database conference. From 2001 to 2007, he was a member of the VLDB foundation (VLDB Endowment Inc.). Since 2009, Prof. Freytag has been the spokesperson of the department DBIS of the German Informatics Society (GI).
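A toy illustration of one query-optimisation idea, selection pushdown, may help make the topic concrete. The relation and query below are invented for the example:

```python
# Toy relation and query: SELECT name FROM users WHERE age > 30
users = [{"name": "ana", "age": 34}, {"name": "bo", "age": 25},
         {"name": "cy", "age": 41}]

# Unoptimised plan: project all rows first, then filter — the
# projection materialises rows the filter discards anyway.
projected = [{"name": r["name"], "age": r["age"]} for r in users]
naive = [r["name"] for r in projected if r["age"] > 30]

# Optimised plan: push the selection below the projection so only
# qualifying rows are materialised. Same result, less intermediate work.
pushed = [r["name"] for r in users if r["age"] > 30]
```

A query optimiser performs such plan rewrites automatically, guided by cost estimates; on large relations the saved intermediate results dominate query run time.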
- Large IBM Server Linux/AIX with DBMS IBM DB2
- Computer cluster with 128 cores
- 30TB storage capacity
- Renowned American IT/DBMS manufacturer: improving existing database management systems (DBMSs) in the area of query optimisation; extending existing ETL tools
- Renowned American IT/DBMS manufacturer: extending DBMS functionality; designing and prototyping performance improvements in query processing; suggestions for future extension of the DBMS products
- Well-known German software manufacturer: continuous consulting in the area of database systems, specifically query processing, over several years to improve performance and functionality
- Well-known German company: design and implementation of a query processing optimiser for the Lightweight Directory Access Protocol (LDAP) product of this company
- Consulting for various SMEs in Germany in the area of data modelling and process modelling using state-of-the-art DBMS technology; using DBMS technology within their own products; strategic consulting for the long-term use of DBMS technology