Dr. Stein has extensive experience in the conception and development of serious games and gamification for knowledge transfer, especially for museums, cultural institutions and medicine. As a Germanist and computer scientist, he commands both the conceptual and cultural perspective and the technical side of implementation. He has led more than 15 software development projects covering mobile apps, VR, AR and browser applications. When it comes to imparting knowledge and gaining experience, games are the medium of our time, according to Dr. Stein. As a researcher and game developer, he is particularly drawn to challenging mediation problems.
Dr. Stein develops concepts for knowledge-transfer games, serious games and gamification, and is also involved in software development, implementation and testing. In addition, he offers workshops on VR and games as a cultural technique.
The following software projects were all developed iteratively and with agile methods in an academic context, based on Dr. Stein's concepts. They were tested and published under his leadership and represent a significant part of his publications.
- Mein Objekt is an innovative museum game that enables visitors to engage in dialogue with objects, seeking to transform digital habits into cultural interest. Machine learning and adaptive language tailor the application to the individual visitor, resulting in a personalized, interest-driven museum tour. The project was developed for the Humboldt Forum as part of museum4punkt0 and will be used there for the first time when it opens. It is developed in React Native and deployed as a BYOD (bring your own device) application for Android and iOS.
- Diary is a tool for exploring interdisciplinary scientific work, developed for the Experimentalzone of the Cluster of Excellence Image Knowledge Gestaltung to better understand interdisciplinary collaboration. It is developed for macOS in Objective-C and can record, visualize and synchronize activities such as software usage, keystrokes and activity cycles with a server. It is used to explore digital working practices and habits across disciplines and also supports users in reflecting on their own working practices.
- iWrite is a gamified tool to improve scientific writing processes. It is developed for macOS in Objective-C and allows the user to organize writing processes into focused sessions, eliminate distractions, visualize progress, and schedule writing sessions. It also unlocks step-by-step tips and videos from writing experts.
- BWG VR is an interactive, stereoscopic 360° experience for virtual reality headsets that shows the Cluster of Excellence Image Knowledge Gestaltung in its various spaces and presents selected actors of the cluster in their working environments. The connection between space, objects and people is presented, and the complex configuration of different actors is made tangible.
- Singleton is a game for integrating personal goals into everyday life and supporting individual development through self-imposed priorities, drawing on theoretical principles from psychology and game studies. It is developed in Unity Mobile for Android and iOS and enables entertaining, goal-oriented micro-interventions in a level system built on individually generated card decks. Based on the Big Five personality traits, it dynamically assigns specific card packs to each user.
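The trait-to-deck assignment described above might be sketched as follows; the deck names, scores and threshold are purely illustrative assumptions, since Singleton's actual mapping is not documented here:

```python
# Hypothetical sketch: assigning card decks from Big Five trait scores.
# Deck names and the threshold are illustrative, not Singleton's data.

DECKS = {
    "openness": "exploration",
    "conscientiousness": "planning",
    "extraversion": "social",
    "agreeableness": "cooperation",
    "neuroticism": "calm",
}

def assign_decks(trait_scores, threshold=0.6):
    """Return the card decks for traits whose normalised score (0..1)
    exceeds the threshold, ordered from strongest trait to weakest."""
    ranked = sorted(trait_scores.items(), key=lambda kv: kv[1], reverse=True)
    return [DECKS[trait] for trait, score in ranked if score > threshold]

decks = assign_decks({"openness": 0.9, "neuroticism": 0.7, "extraversion": 0.3,
                      "conscientiousness": 0.5, "agreeableness": 0.2})
# decks == ["exploration", "calm"]
```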
- ID+Lab is the prototypical front-end of a publication platform designed to publish interdisciplinary research as a network and to show connections that otherwise remain hidden. It is developed as a web application and is based on Semantic Web technologies with GraphDB in the background. Publications are modeled here as a semantic network and visualized dynamically and intuitively. A DFG research proposal to deepen the project is currently under review.
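The underlying idea, publications as a network linked through shared concepts, can be sketched in a few lines; the data and the linking rule are invented for illustration, and the real platform uses RDF and GraphDB rather than plain Python:

```python
# Illustrative sketch of the ID+Lab idea: modelling publications as a
# semantic network and surfacing connections a flat list would hide.
# The sample data and the "shared concept" linking rule are assumptions.

publications = {
    "pub1": {"concepts": {"image", "knowledge"}},
    "pub2": {"concepts": {"knowledge", "design"}},
    "pub3": {"concepts": {"materiality"}},
}

def related(pub_id, pubs):
    """Publications sharing at least one concept with the given one."""
    own = pubs[pub_id]["concepts"]
    return sorted(p for p, data in pubs.items()
                  if p != pub_id and own & data["concepts"])

# related("pub1", publications) == ["pub2"]
```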
- Reading Revisited is a VR application that tests reading in virtual reality and dynamically integrates various texts into moving landscapes. It explores what influence reading environments have on comprehension and memorization, and whether a text can be thought of as a combination of environment and reading. Texts from the cooperating publisher Merve Verlag can be read and experienced in VR in a specially developed environment.
- Bee Virtual is a collaborative VR application in which two subjects control virtual bees by gaze direction and steer them through a virtual space. It was developed in Unity 3D for Oculus Rift. This is modeled on the physical space of the Central Laboratory Room of the Cluster of Excellence Image Knowledge Gestaltung. In an experimental setting, it was investigated how objects are remembered in VR, how virtual spaces can be transferred to physical ones, and to what extent communication among the test subjects influences the remembering of environmental properties.
- PlosOne Metadata Extractor is a browser plugin that extracts metadata, including abstracts, from lists of PlosOne publications, making them processable and analyzable for text mining.
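The extraction step might look roughly like this; the markup and class names are hypothetical, and the actual plugin runs inside the browser rather than as a Python script:

```python
# Illustrative sketch of metadata extraction from a publication listing.
# The HTML structure and class names are invented; the real plugin works
# on PlosOne's actual markup as a browser extension.
from html.parser import HTMLParser

class MetadataExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.records = []       # one dict per publication
        self._field = None      # field currently being read

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class", "")
        if cls == "article-title":
            self.records.append({"title": "", "abstract": ""})
            self._field = "title"
        elif cls == "article-abstract":
            self._field = "abstract"

    def handle_endtag(self, tag):
        self._field = None

    def handle_data(self, data):
        if self._field and self.records:
            self.records[-1][self._field] += data.strip()

html = ('<h2 class="article-title">Gene X in Species Y</h2>'
        '<p class="article-abstract">We study gene X.</p>')
parser = MetadataExtractor()
parser.feed(html)
# parser.records now holds one record with title and abstract
```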
- Decide & Survive is a game that makes decision-making processes in foreign-policy contexts tangible and investigates the influence of user interfaces on player decisions by using both aggressive and neutral interfaces and sounds. It is developed in Java with JavaFX, with a PHP server back end.
- OncoLogg is a game designed for multiple myeloma patients and their families to educate them about therapies, risks and side effects. It lets the player experience different therapy cycles and refocuses from healing to achieving life goals. The game was developed together with the Charité in Java for Android.
- iglos, the Intelligent Glossary, is a semantic-network application intended to identify and resolve terminology and comprehension problems in interdisciplinary academic projects. It is browser-based, developed with web technologies, and RDF/OWL-compatible.
- Forum Junge Spitzenforscher 2013: project “Big Data for Knowledge Networks”
Prof. Weidlich is a professor at the Department of Computer Science, where he leads the Process-Driven Architectures research group. The goal of his team is to support and improve the design and analysis of process-oriented information systems (POIS) and event-driven systems (ES). POIS are software systems that help to automate, monitor, and control processes. These systems are established in various domains, from logistics through healthcare to infrastructure monitoring. The team’s research focuses on formal methods for behavioural modelling and verification, event-driven approaches to monitoring and controlling processes, and questions of process-driven data integration. The group also works on techniques that optimise the run-time behaviour of event-driven systems: in particular, the evaluation of event queries over streams can be made more efficient once regularities within the event streams have been detected. In 2016, Prof. Weidlich was named Junior Fellow of the German Informatics Society (Gesellschaft für Informatik, GI) and awarded the Berlin Research Prize (Young Scientist) by the Governing Mayor of Berlin.
- extensive experience with the documentation of business processes and in training for process management techniques
- expertise in process mining, the data-driven analysis of processes in terms of qualitative (compliance requirements) and quantitative (bottleneck analysis, management of resource deployment) properties
- know-how in scalable infrastructures for event stream processing
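A basic process-mining primitive behind such analyses, the directly-follows relation, can be sketched as follows; the event log is invented for illustration, and real process-mining tools build discovery and conformance checking on top of this:

```python
# Sketch of a core process-mining step: counting how often activity a is
# directly followed by activity b across the traces of an event log.
from collections import Counter

def directly_follows(event_log):
    """Directly-follows counts over all traces of an event log."""
    counts = Counter()
    for trace in event_log:
        for a, b in zip(trace, trace[1:]):
            counts[(a, b)] += 1
    return counts

# Invented example log: each trace is one process execution.
log = [
    ["register", "check", "decide", "notify"],
    ["register", "check", "check", "decide", "notify"],
]
df = directly_follows(log)
# df[("register", "check")] == 2, df[("check", "check")] == 1
```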
- for a leading US cancer clinic: analysis and improvement of clinical processes based on the data of a real-time locating system
- for an international oil and gas group: development of techniques for detecting irregularities in streams of sensor data
- for a well-known German manufacturer of enterprise software: design and development of add-ons for a business process modelling platform
Our daily life depends more and more on computational systems embedded in common appliances: think of advanced driver assistance systems in cars, medical devices, or industrial supervisory control and data acquisition systems. Since such systems also perform safety-critical tasks, it is all the more important to provide effective and efficient quality assurance for them. The specification, verification and testing theory group researches methods for model-based development and model checking, logical verification, and automated testing of safety-critical software. Prof. Schlingloff is chief scientist of the System Quality Center at Fraunhofer FOKUS, Berlin, and chairman of the boards of GFaI e.V. and ZeSys e.V.
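The core of explicit-state model checking, exhaustively exploring a system's state space to show that no unsafe state is reachable, can be sketched in a few lines; the toy transition system is an invented example, and real model checkers handle vastly larger state spaces symbolically:

```python
# Minimal sketch of explicit-state model checking: breadth-first search
# over a transition system to verify a safety property. The toy
# traffic-light model is invented for illustration.
from collections import deque

TRANSITIONS = {
    "red": ["green"],
    "green": ["yellow"],
    "yellow": ["red"],
}

def reachable(initial, transitions):
    """All states reachable from the initial state."""
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        for nxt in transitions.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Safety check: the unsafe state "all_off" is never reached.
assert "all_off" not in reachable("red", TRANSITIONS)
```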
- Major German company for communication and sensors: Student semester project for the design and implementation of a system for distributed control of indoor air quality.
The research of Prof. Leser and his group focuses on all aspects of the management, integration and analysis of heterogeneous, large and distributed data sets, including natural language texts (text mining and information extraction). This encompasses subjects such as data warehouses and ETL, graph databases, the deep web and semantic web, machine learning, similarity search, scientific workflows, statistical methods of data analysis, as well as methods for assessing and securing data quality. Professor Leser's team conducts research in a variety of interdisciplinary projects, especially with colleagues from the life sciences, covering the range from basic molecular biology to systems medicine.
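A basic building block of similarity search, one of the topics listed above, can be sketched with character shingles and Jaccard similarity; this minimal version is an assumption-laden illustration and omits the indexing techniques needed at the scale the group works with:

```python
# Sketch of set-based text similarity: character k-grams (shingles)
# compared with the Jaccard coefficient. Real similarity-search systems
# add indexing (e.g. minhashing) to avoid pairwise comparison.

def shingles(text, k=3):
    """Character k-grams of a whitespace-normalised, lowercased string."""
    t = " ".join(text.lower().split())
    return {t[i:i + k] for i in range(len(t) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of the shingle sets of two strings (0..1)."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

# Near-duplicate strings score high, unrelated strings score near zero.
score = jaccard("gene expression data", "gene expression")
```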
Very good IT facilities:
- several state-of-the-art parallel computer clusters (20-80 CPUs, 1 TB main memory)
- cluster with 60+ cores
- 50TB+ storage
- For an IT service provider: Consultancy and prototype development in the field of master data standardisation and integration
- For an IT manufacturer: Development and evaluation of algorithms for analysing data quality
- For an international pharmaceutical company: Consultancy and development of text-mining procedures in biomarker development
- For a medium-sized biotech company: Joint system development (partly funded by the German Federal Ministry for Economic Affairs and Energy) in the field of evaluating human genotype changes
The structure research and electron microscopy group at HU Berlin actively develops and applies a broad spectrum of techniques for the analysis of small volumes of material by transmission electron microscopy (TEM), scanning transmission electron microscopy (STEM), scanning electron microscopy (SEM), and light microscopy. The know-how we can offer external partners is based on the development of radically new electron diffraction techniques for ab-initio determination of atomic structure in (nano-)crystalline materials, and on the development of inline holographic techniques for the TEM as well as the optical microscope. The inline holographic techniques developed within the group are applied in the TEM for mapping electrostatic potentials, charge density distributions, and strain in semiconductors and ceramics. In the optical microscope we apply these techniques for quantitative phase microscopy of living cells and for mapping the topography of surfaces with nm resolution along the optical axis.
Transmission electron microscopes:
- JEOL JEM2200FS FEG-(S)TEM equipped with an in-column energy filter for electron energy-loss spectroscopy, HAADF- and BF-STEM detectors, EDX spectrometer for local elemental analysis, NanoMegas ASTAR precession unit for phase mapping and electron crystallography, electrostatic biprism for off-axis holography, Instrument control software for Large-Angle Rocking-Beam Electron Diffraction (LARBED)
- Hitachi H-8110 (S)TEM with LaB6 cathode, equipped with HAADF- and BF-STEM detectors
Scanning electron microscope:
- Zeiss GeminiSEM 500 with a field-emission source, equipped with a multi-segment STEM detector, an EDX-Spectrometer, and an E-beam lithography unit (RAITH-Elphy)
- Within a cooperation agreement with a leading provider of optical products, Prof. Koch has developed novel techniques for analysing series of images.
Model-driven engineering raises the level of abstraction in software engineering by using models as primary development artifacts. In particular, domain-specific modelling languages can ease the transition between informally sketched requirements or designs and implementations by supporting high-level yet formal representations as a starting point for automation. Moreover, using a model-based development approach, critical system properties can be analyzed, validated and verified even before the system is actually built. Model-driven development thus leads to an increase in both productivity and quality. To some extent, model-driven engineering has made its way into industrial practice, most notably for the development of embedded systems in various domains. However, model-driven engineering does not suffice to successfully manage all challenges of modern software engineering, and actually creates new problems. Research conducted at the Chair of Model-driven Software Engineering is particularly driven by relevant challenges and problems arising from the adoption of the model-driven engineering paradigm in industrial practice.
- Experience in implementing model-driven engineering methods, techniques and processes
- Know-how regarding the set-up of model-based transformation chains (domain-specific modeling languages, model transformation and interpretation, code generation) and development environments (collaborative modeling, (co-)evolution of models, model repair and synchronization)
- Expertise in the field of version and variant management, especially customized configuration management and software product lines
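The model-to-code step in such transformation chains can be illustrated with a deliberately tiny generator; the entity model and the template logic are assumptions standing in for real metamodels and template engines:

```python
# Toy sketch of code generation in a model-based tool chain: a simple
# entity model (here just a dict; real chains use proper metamodels)
# is turned into a source-code skeleton.

def generate_class(model):
    """Generate a Python class skeleton from a simple entity model."""
    lines = [f"class {model['name']}:"]
    params = ", ".join(f"{f}=None" for f in model["fields"])
    lines.append(f"    def __init__(self, {params}):")
    for field in model["fields"]:
        lines.append(f"        self.{field} = {field}")
    return "\n".join(lines)

model = {"name": "Sensor", "fields": ["id", "unit"]}
code = generate_class(model)
# code is a valid class definition assigning self.id and self.unit
```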
- Collaboration with a Berlin-based software company on the development of innovative software architecture analysis techniques for the quality assurance of embedded systems
- Consulting for a major German automotive supplier with regard to fundamental questions of configuration management of models for the model-driven development of embedded systems
- Support of an international electrical engineering corporation with the model-based development of software components for a new generation of internet-based multimedia building communication systems
The chair of Visual Computing develops new methods for the analysis and synthesis of image and video data, including algorithms for estimating shape, material, motion and deformation from monocular and multi-view camera systems. In national and international collaborations, these algorithms are applied in fields such as multimedia, VR/AR, industry, medicine, and security.
- Various cameras
- Multispectral sensors, 3D sensors
- Lighting and calibration systems
- Development of new methods for automatic inspection and damage classification of sewer networks with a water supply company
- Development of augmented reality systems for automobile production processes with a car manufacturer
- Analysis of multispectral imaging for tissue classification in collaboration with a medical technology manufacturer
G.F. Schreinzer Positronik, Steinbeis GmbH & Co. KG, Pronova Analysentechnik GmbH & Co. KG, newtec Umwelttechnik GmbH
Biosystems engineering works at the interface between engineering and biological production processes. Prof. Schmidt and his team develop engineering solutions for sustainable agricultural crop production and other environmentally friendly technologies. Prof. Schmidt’s research thus leads to innovative plant farming methods in greenhouses, outdoors and in other intensive crop farming systems. His research areas include alternative energy supply systems (low-energy greenhouses) and closed material cycles for intensive crop farming (water hygiene, sensor systems and algorithms for fully automated nutrient solution supply in closed cycles). His main activities here are the development of sensors for gas analysis, climate measurement technology and software supporting decision making in automation systems. Moreover, the team provides energetic assessments of complete production systems and parts thereof, as well as process analyses.
- Experimental greenhouses with energy and material flow analytics, CO2 enrichment, artificial lights and fog systems
- Plant monitors for continuous measurement of photosynthesis, transpiration, tissue temperature and stomatal conductance; climate measurement; gas analyses (CO2, ethylene); soil moisture sensors
- Freely programmable automation system for climate and process control in greenhouses
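A fully automated nutrient solution supply of the kind described above might, at its simplest, follow a hysteresis rule like this sketch; the setpoints, deadband and actions are illustrative assumptions, not the group's actual control algorithm:

```python
# Hypothetical sketch of an automated dosing rule for a closed-cycle
# nutrient solution supply. Setpoints and the hysteresis logic are
# invented for illustration.

def dosing_action(ec_measured, ec_setpoint=2.0, deadband=0.2):
    """Decide on a dosing action from an EC (electrical conductivity)
    reading of the circulating nutrient solution, in mS/cm."""
    if ec_measured < ec_setpoint - deadband:
        return "dose_nutrients"     # solution too dilute
    if ec_measured > ec_setpoint + deadband:
        return "dilute_with_water"  # solution too concentrated
    return "hold"                   # within the deadband: do nothing

# dosing_action(1.6) == "dose_nutrients"; dosing_action(2.1) == "hold"
```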
- G.F. Schreinzer Positronik, Steinbeis GmbH & Co. KG: Development of an automation system for greenhouses based on measurement details of plants (Phytocontrol)
- Steinbeis GmbH & Co. KG: National collaborative research project “The Low Energy Greenhouse” (“Zukunftsinitiative Niedrigenergiegewächshaus”, ZINEG)
- Pronova Analysentechnik GmbH & Co. KG: Development of ion-selective sensors for continuous recording of ion proportions in circulating nutrient solution systems; development of a measuring device to analyse phytometric reactions in plants
- newtec Umwelttechnik GmbH: Development of re-circulating irrigation system with reduced phytosanitary risk in greenhouses
At the Chair of Software Engineering, Prof. Grunske and his team specialise in methods of software technology for the automated development and quality control of software systems. His work also involves probabilistic techniques with which the probable and less probable behaviour of a program can be modelled, allowing software anomalies to be discovered and corrected more easily. Such statistical models are used in monitoring and debugging programs at runtime as well as in software testing, which supports the development of safe and reliable software systems. Furthermore, Prof. Grunske develops methods that enable a precise definition of the quality requirements of software systems, the formalisation of verification conditions, (technical) safety in embedded systems, and process and performance management.
- Software engineering
- Testing and verification
- Statistics/probabilistic methods
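The idea of modelling probable and less probable program behaviour can be sketched with a first-order Markov model over execution traces; the events, traces and the zero-probability anomaly criterion are invented for illustration, and the group's actual techniques are considerably more elaborate:

```python
# Sketch of probabilistic behaviour modelling: learn transition
# probabilities between program events from observed traces, then score
# new traces; improbable traces hint at anomalies. Data is invented.
from collections import Counter

def learn_model(traces):
    """Estimate first-order transition probabilities between events."""
    pair_counts, from_counts = Counter(), Counter()
    for trace in traces:
        for a, b in zip(trace, trace[1:]):
            pair_counts[(a, b)] += 1
            from_counts[a] += 1
    return {pair: c / from_counts[pair[0]] for pair, c in pair_counts.items()}

def trace_probability(model, trace):
    """Probability of a trace under the model (0 for unseen transitions)."""
    p = 1.0
    for a, b in zip(trace, trace[1:]):
        p *= model.get((a, b), 0.0)
    return p

model = learn_model([["open", "read", "close"],
                     ["open", "read", "read", "close"]])
# trace_probability(model, ["open", "close"]) == 0.0 -> anomalous
```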
- Formalisation of application scenarios in cooperation with TWT GmbH: “Safe.Spec: Quality control of behaviour requirements”
- Using software systems to derive probabilistic models that can be used as specifications during the software engineering process: “EMPRESS: Extracting and Mining of Probabilistic Event Structures from Software Systems”
- Development of evaluation methods for probabilistic models as well as machine learning based techniques for the transformation of models: “ENSURE-II: ENsurance of Software evolution by Run-time cErtification”
Prof. Freytag holds the chair of Databases and Information Systems (DBIS). His research interests include all aspects of processing and query optimisation in (object-)relational database systems, developments related to databases (such as semi-structured or graph-based data), data quality, big data analyses, as well as privacy support in database and information systems. Furthermore, Prof. Freytag is involved in many cooperations using database technology for applications such as geoinformation systems (GIS), bioinformatics, physics and life sciences. In the past, he received the IBM Faculty Award four times for collaborative work concerning databases, middleware, and bioinformatics/life sciences. In 2009 and 2010, Prof. Freytag won the HP Labs Innovation Research Award for his research in the field of databases and cloud computing. He was one of the organisers of the VLDB (Very Large Data Bases) conference, the most important international database conference, held in Berlin in 2003. From 2001 to 2007, he was a member of the VLDB foundation (VLDB Endowment Inc.). Since 2009, Prof. Freytag has been the spokesperson of the department DBIS of the German Informatics Society (GI).
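Cost-based query optimisation, one focus of the chair, can be illustrated with a toy join-order search; the table cardinalities and the uniform selectivity are invented, and production optimisers are far more sophisticated (histograms, dynamic programming, physical operators):

```python
# Toy illustration of cost-based query optimisation: enumerate join
# orders for three tables and pick the plan with the smallest estimated
# intermediate results. All statistics are invented for illustration.
from itertools import permutations

CARD = {"orders": 10_000, "customers": 1_000, "nations": 25}
SELECTIVITY = 0.001  # assumed uniform join selectivity

def plan_cost(order):
    """Sum of estimated intermediate result sizes for a left-deep plan."""
    size, cost = CARD[order[0]], 0.0
    for table in order[1:]:
        size = size * CARD[table] * SELECTIVITY
        cost += size
    return cost

best = min(permutations(CARD), key=plan_cost)
# Joining the small tables first yields far cheaper plans.
```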
- Large IBM server (Linux/AIX) with IBM DB2 DBMS
- Computer cluster with 128 cores
- 30TB storage capacity
- Renowned American IT/DBMS manufacturer: improving existing database management systems (DBMSs) in the area of query optimisation; extending existing ETL tools
- Renowned American IT/DBMS manufacturer: extending DBMS functionality; designing and prototyping performance improvements in query processing; suggestions for future extension of the DBMS products
- Well-known German software manufacturer: continuous consulting in the area of database systems, specifically query processing, over several years to improve performance and functionality
- Well-known German company: design and implementation of a query processing optimiser for this company's Lightweight Directory Access Protocol (LDAP) product
- Consulting for various SMEs in Germany in the areas of data modeling and process modeling using state-of-the-art DBMS technology; using DBMS technology within their own products; strategic consulting on the long-term use of DBMS technology