Prof. Weidlich is a professor at the Department of Computer Science, where he leads the Process-Driven Architectures research group. His team’s goal is to support and improve the design and analysis of process-oriented information systems (POIS) and event-driven systems. POIS are software systems that help to automate, monitor, and control processes; they are established in various domains, ranging from logistics and healthcare to infrastructure monitoring. The team’s research focuses on formal methods for behavioural modelling and verification, event-driven approaches to monitoring and controlling processes, and questions of process-driven data integration. The group also works on techniques that optimise the run-time behaviour of event-driven systems: in particular, the evaluation of event queries over streams can be made more efficient once regularities within the event streams have been detected. In 2016, Prof. Weidlich was named Junior Fellow of the German Informatics Society (Gesellschaft für Informatik, GI) and awarded the Berlin Research Prize (Young Scientist) by the Governing Mayor of Berlin.
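The efficiency gain from detecting stream regularities can be illustrated with a minimal sequence-query matcher in Python: if we know that a matching `first` event becomes irrelevant once a bounded time window has passed, partial matches can be discarded early and the engine's state stays small. This is an illustrative sketch under that assumption, not the group's actual query engine; all names are invented.

```python
from collections import deque

def match_seq(stream, first, second, window):
    """Report (i, j) index pairs where an event of type `first` is
    followed by an event of type `second` within `window` positions.
    Illustrative only: real engines compile queries into automata
    and share state across many concurrent queries."""
    pending = deque()   # indices of recent `first` events, oldest left
    matches = []
    for i, ev in enumerate(stream):
        # regularity exploited: `first` events outside the window
        # can never complete a match, so drop them immediately
        while pending and i - pending[0] > window:
            pending.popleft()
        if ev == first:
            pending.append(i)
        elif ev == second:
            matches.extend((j, i) for j in pending)
    return matches
```

For example, `match_seq(list("AXBXAB"), "A", "B", 3)` pairs each `A` with every `B` that follows within three positions, while the early `A` is evicted before it can pair with the late `B`.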
- extensive experience with the documentation of business processes and in training for process management techniques
- expertise in process mining: data-driven analysis of processes in terms of qualitative (compliance requirements) and quantitative (bottleneck analysis, resource deployment management) properties
- know-how in scalable infrastructures for event stream processing
- for a leading US cancer clinic: analysis and improvement of clinical processes based on data from a real-time locating system
- for an international oil and gas group: development of techniques for detecting irregularities in streams of sensor data
- for a well-known German manufacturer of enterprise software: design and development of add-ons for a business process modelling platform
Our daily life increasingly depends on computational systems embedded in common appliances. Just think of advanced driver assistance systems in cars, medical devices, or industrial supervisory control and data acquisition systems. Since such systems also realise safety-critical tasks, it is all the more important to provide effective and efficient quality assurance for them. The Specification, Verification and Testing Theory group researches methods for model-based development and model checking, logical verification, and automated testing of safety-critical software. Prof. Schlingloff is chief scientist of the System Quality Center at Fraunhofer FOKUS, Berlin, and chairman of the boards of GFaI e.V. and ZeSys e.V.
- Major German company for communication and sensors: Student semester project for the design and implementation of a system for distributed control of indoor air quality.
The research of Prof. Leser and his group focuses on all aspects of the management, integration and analysis of heterogeneous, large and distributed data sets, including natural language texts (text mining and information extraction). This encompasses subjects such as data warehouses and ETL, graph databases, the deep web and the semantic web, machine learning, similarity search, scientific workflows, and statistical methods of data analysis, as well as methods for assessing and securing data quality. The team of Prof. Leser conducts research in a variety of interdisciplinary projects, especially with colleagues from the life sciences, covering the range from basic molecular biology to Systems Medicine.
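Similarity search, one of the topics listed above, typically builds on string distance measures. As a small generic illustration (not code from the group), here is the classic Levenshtein edit distance, computed with a two-row dynamic program:

```python
def levenshtein(a, b):
    """Minimum number of single-character insertions, deletions and
    substitutions turning string a into string b. A basic building
    block of similarity search; illustrative sketch only."""
    prev = list(range(len(b) + 1))        # distances for empty prefix of a
    for i, ca in enumerate(a, 1):
        cur = [i]                          # deleting i chars of a
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]
```

For instance, `levenshtein("kitten", "sitting")` yields 3 (two substitutions and one insertion); in a similarity search, candidates within a small distance threshold would be returned as matches.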
Very good IT-facilities:
- several state-of-the-art parallel computer clusters (20-80 CPUs, 1 TB main memory)
- cluster with 60+ cores
- 50TB+ storage
- For an IT service provider: Consultancy and prototype development in the field of master data standardisation and integration
- For an IT manufacturer: Development and evaluation of algorithms for analysing data quality
- For an international pharmaceutical company: Consultancy and development of text-mining-procedures in biomarker development
- For a medium-sized biotech company: Joint system development (partly funded by the German Federal Ministry for Economic Affairs and Energy) in the field of evaluating human genotype changes
The Structure Research and Electron Microscopy group at HU Berlin actively develops and applies a broad spectrum of techniques for the analysis of small volumes of material by transmission electron microscopy (TEM), scanning transmission electron microscopy (STEM), scanning electron microscopy (SEM), and light microscopy. The know-how we can offer external partners is based on the development of radically new electron diffraction techniques for the ab-initio determination of atomic structure in (nano-)crystalline materials, as well as on the development of inline holographic techniques for the TEM and the optical microscope. The inline holographic techniques developed within the group are applied in the TEM for mapping electrostatic potentials, charge density distributions, and strain in semiconductors and ceramics. In the optical microscope, we apply these techniques for quantitative phase microscopy of living cells and for mapping the topography of surfaces with nm resolution along the optical axis.
Transmission electron microscopes:
- JEOL JEM2200FS FEG-(S)TEM equipped with an in-column energy filter for electron energy-loss spectroscopy, HAADF- and BF-STEM detectors, EDX spectrometer for local elemental analysis, NanoMegas ASTAR precession unit for phase mapping and electron crystallography, electrostatic biprism for off-axis holography, Instrument control software for Large-Angle Rocking-Beam Electron Diffraction (LARBED)
- Hitachi H-8110 (S)TEM with LaB6 cathode, equipped with HAADF- and BF-STEM detectors
Scanning electron microscope:
- Zeiss GeminiSEM 500 with a field-emission source, equipped with a multi-segment STEM detector, an EDX-Spectrometer, and an E-beam lithography unit (RAITH-Elphy)
- Within a cooperation agreement with a leading provider of optical products, Prof. Koch has developed novel techniques for analysing series of images.
Model-driven engineering raises the level of abstraction in software engineering by using models as primary development artefacts. In particular, domain-specific modelling languages can ease the transition between informally sketched requirements or designs and implementations by supporting high-level yet formal representations as a starting point for automation. Moreover, with a model-based development approach, critical system properties can be analysed, validated and verified even before the system is actually built. Model-driven development thus increases both productivity and quality. To some extent, model-driven engineering has made its way into industrial practice, most notably for the development of embedded systems in various domains. However, model-driven engineering does not suffice to successfully manage all challenges of modern software engineering, and it actually creates new problems. Research conducted at the Chair of Model-Driven Software Engineering is driven in particular by challenges and problems arising from the adoption of the model-driven engineering paradigm in industrial practice.
- Experience in implementing model-driven engineering methods, techniques and processes
- Know-how regarding the set-up of model-based transformation chains (domain-specific modeling languages, model transformation and interpretation, code generation) and development environments (collaborative modeling, (co-)evolution of models, model repair and synchronization)
- Expertise in the field of version and variant management, especially customized configuration management and software product lines
- Collaboration with a Berlin-based software company on the development of innovative software architecture analysis techniques for the quality assurance of embedded systems
- Consulting for a major German automotive supplier with regard to fundamental questions of configuration management of models for the model-driven development of embedded systems
- Support of an international electrical engineering corporation with the model-based development of software components for a new generation of internet-based multimedia building communication systems
The Chair of Visual Computing develops new methods for the analysis and synthesis of image and video data. This includes algorithms for estimating shape, material, motion and deformation from monocular and multi-view camera systems. In national and international collaborations alike, these algorithms are applied in fields such as multimedia, VR/AR, industry, medicine, and security.
- Various cameras
- Multispectral sensors, 3D sensors
- Lighting and calibration systems
- Development of new methods for automatic inspection and damage classification of sewer networks with a water supply company
- Development of augmented reality systems for automobile production processes with a car manufacturer
- Analysis of multispectral imaging for tissue classification in collaboration with a medical technology manufacturer
Biosystems engineering works at the interface between engineering and biological production processes. Prof. Schmidt and his team develop engineering solutions for sustainable agricultural crop production and other environmentally friendly technologies. Prof. Schmidt’s research thus leads to innovative plant farming methods in greenhouses, outdoors, and in other intensive crop farming systems. His research areas include alternative energy supply systems (low-energy greenhouses) and closed material cycles for intensive crop farming (water hygiene, sensor systems and algorithms for fully automated nutrient solution supply in closed cycles). His main activity herein is the development of sensors for gas analysis, climate measurement technology, and decision-support software for automation systems. Moreover, the team also provides energetic assessments of complete production systems and parts thereof, as well as process analyses.
- Experimental greenhouses with energy and material flow analytics, CO2 enrichment, artificial lights and fog systems
- Plant monitors for continuous measurement of photosynthesis, transpiration, tissue temperature and stomatal conductance; climate measurement; gas analyses (CO2, ethylene); soil moisture sensors
- Freely programmable automation system for climate and process control in greenhouses
- G.F. Schreinzer Positronik, Steinbeis GmbH & Co. KG: Development of an automation system for greenhouses based on measurement details of plants (Phytocontrol)
- Steinbeis GmbH & Co. KG: National collaborative research project “The Low Energy Greenhouse” (“Zukunftsinitiative Niedrigenergiegewächshaus”, ZINEG)
- Pronova Analysentechnik GmbH & Co. KG: Development of ion-selective sensors for continuous recording of ion concentrations in circulating nutrient solution systems; development of a measuring device to analyse phytometric reactions in plants
- newtec Umwelttechnik GmbH: Development of re-circulating irrigation system with reduced phytosanitary risk in greenhouses
At the Chair of Software Engineering, Prof. Grunske and his team specialise in methods of software technology relevant to the automated development and quality control of software systems. His work also involves probabilistic techniques with which the more and less probable behaviours of a program can be modelled, allowing software anomalies to be discovered and corrected more easily. Such statistical models are used in monitoring and debugging programs at runtime as well as in software testing, which supports the development of safe and reliable software systems. Furthermore, Prof. Grunske develops methods for the precise definition of quality requirements of software systems, the formalisation of verification conditions, (technical) safety in embedded systems, and process and performance management.
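A well-known instance of such statistical models is spectrum-based fault localisation, which ranks program lines by how strongly their execution correlates with failing test runs. The sketch below uses the standard Ochiai metric; it is a generic illustration of the idea, not Prof. Grunske's specific method, and all names are invented.

```python
import math

def ochiai(cov_fail, cov_pass, total_fail):
    """Ochiai suspiciousness score per program line.

    cov_fail[line] / cov_pass[line]: number of failing / passing test
    runs that executed the line; total_fail: number of failing runs.
    Lines executed mostly by failing runs score close to 1.0."""
    scores = {}
    for line in set(cov_fail) | set(cov_pass):
        ef = cov_fail.get(line, 0)   # failing runs covering this line
        ep = cov_pass.get(line, 0)   # passing runs covering this line
        denom = math.sqrt(total_fail * (ef + ep))
        scores[line] = ef / denom if denom else 0.0
    return scores
```

A line covered by all failing runs and no passing runs receives the maximum score of 1.0, so the debugger would inspect it first.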
- Software engineering
- Testing and verification
- Statistics/probabilistic methods
- Formalisation of application scenarios in cooperation with TWT GmbH: “Safe.Spec: Quality control of behaviour requirements”
- Deriving probabilistic models from software systems that can be used as specifications during the software engineering process: “EMPRESS: Extracting and Mining of Probabilistic Event Structures from Software Systems”
- Development of evaluation methods for probabilistic models as well as machine learning based techniques for the transformation of models: “ENSURE-II: ENsurance of Software evolution by Run-time cErtification”
Prof. Freytag holds the Chair of Databases and Information Systems (DBIS). His research interests include all aspects of query processing and optimisation in (object-)relational database systems, developments related to databases (such as semi-structured or graph-based data), data quality, big data analyses, and privacy support in database and information systems. Furthermore, Prof. Freytag is involved in many collaborations applying database technology to applications such as geoinformation systems (GIS), bioinformatics, physics, and the life sciences. He has received the IBM Faculty Award four times for collaborative work concerning databases, middleware, and bioinformatics/life sciences. In 2009 and 2010, Prof. Freytag won the HP Labs Innovation Research Award for his research in the field of databases and cloud computing. He was one of the organisers of the 2003 VLDB (Very Large Data Bases) conference in Berlin, the most important international database conference. From 2001 to 2007, he was a member of the VLDB foundation (VLDB Endowment Inc.). Since 2009, Prof. Freytag has been the spokesperson of the DBIS division of the German Informatics Society (GI).
- Large IBM Server Linux/AIX with DBMS IBM DB2
- Computer cluster with 128 cores
- 30TB storage capacity
- Renowned American IT/DBMS manufacturer: improving existing database management systems (DBMSs) in the area of query optimisation; extending existing ETL tools
- Renowned American IT/DBMS manufacturer: extending DBMS functionality; designing and prototyping performance improvements in query processing; suggestions for future extension of the DBMS products
- Well-known German software manufacturer: continuous consulting in the area of database systems over several years, specifically on query processing, to improve performance and functionality
- Well-known German company: design and implementation of a query processing optimiser for the company's Lightweight Directory Access Protocol (LDAP) product
- Consulting for various SMEs in Germany in the area of data modelling and process modelling using state-of-the-art DBMS technology; using DBMS technology within their own products; strategic consulting on the long-term use of DBMS technology