
Logo Bits&Chips Event

12 October 2023

Van der Valk Eindhoven-Best

Join our exhibition and conference on challenges in complex software engineering, high-tech machine learning and system architecture

Gold sponsors

Silver sponsors

Bronze sponsors


Program

The inspiring conference program features three plenary keynotes and five parallel sessions, on software evolution, software efficiency, software robustness, software intelligence and system architecture.

Registration

9:45-10:25

Keynote

Chair: Tanja Vos (Open Universiteit)

Break

11:00-12:25

Lunch

13:45-14:25

Keynote

Chair: Tanja Vos (Open Universiteit)

14:30-15:55

Break

16:30-17:10

Keynote

Chair: Tanja Vos (Open Universiteit)

Drinks


Audience

The Bits&Chips Event brings together (technical) management, engineers and researchers in complex software and system development.

Visitor profiles

Managing directors • Technical managers • Team leads • Project managers • System architects • Software architects • Software developers • Solution providers • Researchers • Technology innovators

Target industries

Aerospace • Agro & food • Automotive • Consumer electronics • Defense • Factory automation • Healthcare • Industrial systems • Logistics • Semicon

Past attendees

Alten • ASM • ASML • Bosch • Canon Production Printing • Capgemini Engineering • Demcon • ICT Group • Imec • Kulicke & Soffa • Lely • Lightyear • Nearfield Instruments • Neways • Nexperia • NXP • Philips • Priva • Prodrive • Signify • Sioux • Technolution • Thales • Thermo Fisher Scientific • TMC • Tomtom • Vanderlande • VDL


12 October 2023

Van der Valk Eindhoven-Best
Eindhovenseweg - Zuid 144
5683 PX Best
The Netherlands

At Techwatch, we can handle all of your marketing and content needs for the high-tech industry. We offer content services, organize (online) events (and offer them as a service) and we create and host webinars, podcasts and videos.

Have a look at our media kit for more information and services.
Contact event organization: events@techwatch.nl
Contact conference program: Nieke Roos

Other events


Nijs van der Vaart (Echopoint Medical)

Driving disruptive innovation in a medical-device startup

15:15-15:55

Innovation in a medical startup requires a broad range of disciplines to progress simultaneously on a limited budget. It all starts with a breakthrough idea of how to solve an unmet clinical need. This proposition needs to be fine-tuned and made financially viable. This goes hand in hand with selecting the right technology and architecture while finding matching development and production partners. Everything you do needs to be in line with medical ISO standards, as clinical testing of the final device and regulatory approval are needed for market access.
Echopoint Medical, a UK university spinout, develops an easy-to-use solution to disrupt the treatment of cardiac diseases. The system uses a sterile flexible catheter to diagnose microvascular diseases in the heart by measuring blood flow in the small heart vessels during surgery using a novel fiber-optic sensor and an optoelectronic console. The choices made in this startup will be outlined and compared to innovation models in large companies.
Nijs van der Vaart holds a PhD in semiconductor quantum physics from Delft University of Technology. He worked at Philips Research on display technologies and optical devices and then moved into the medical device industry. During 10 years at Philips Healthcare, he held various management roles in innovation, clinical science and business incubation in interventional cardiology and radiology. He brought several medical products all the way from initial lab ideas to market introduction. In 2019, he joined Echopoint Medical in London to develop a device for measuring blood flow in the heart and he recently started Bcon Medical in ’s-Hertogenbosch to help surgeons locate tumors during surgery.

Richard van de Laarschot (Capgemini Engineering)

Engineering robustness through hyper-automation

11:45-12:25

Robust software can continue to function correctly and reliably in unexpected situations and is tolerant of all kinds of faults. Therefore, we can regard robustness as a quality property we can and should assess with quality checks. Software development speed has become a survival factor for companies to distinguish themselves from the competition. Growing software stacks, faster product deliveries, growing feature needs, faster turnover of engineers, and so on, are increasingly hampering the software development process. We see hyper-automation as the key differentiator in the software engineering process. It enables a strategy of preventing and finding faults as early as technically possible, through the use of formal modeling languages, code generation and automated verification and validation. We believe that with hyper-automation, our clients can greatly improve the robustness of the software systems they develop. No matter the industry or the application, hyper-automation helps organizations achieve and deliver quality software in a more timely fashion while reducing the cost of delivery.
Richard van de Laarschot is a chief solution architect at Capgemini where he is the thought leader of the Center of Excellence Quality Assurance. He holds a BSc in software engineering and business informatics from Fontys University of Applied Sciences in Eindhoven. He started his career over 30 years ago in the development of embedded hard real-time software in the medical domain. It was during this time that he first learned about all aspects of testing and the value of quality above quantity in software engineering. For the past 15+ years, he has been active in the high-tech industry. He focuses on improving the quality and automation of the software engineering process for several of Capgemini’s clients by helping them develop a quality assurance strategy to achieve key business objectives and deliver higher-quality software in a shorter time and at lower costs.

Hristina Moneva (Canon Production Printing)

Holistic approach to digitally transforming (production printing) product development

11:45-12:25

At Canon Production Printing, we have 20+ years of experience in modeling (parts of) our products and their development, started by local champions and professionalized at the department level. Recently, we initiated a new phase in the scale-up and integration of digital transformation (unifying all the tools we use for developing our products: physical setups, modeling, data science, AI and digital twinning) throughout our entire R&D. In this talk, we'll share our ambitious approach and thinking framework for integrating people, organization, process and technology aspects, as well as some lessons learned along the way.

Hristina Moneva is a system architect on digital transformation at the System Level Modeling department of Canon Production Printing R&D, where she's one of the people responsible for developing the new technological and organizational strategy for the full-scale digital transformation of product development. She started working there in 2016 as a modeling expert, responsible for scaling up the model-based way of working within the System Development departments. Before that, she worked as a research fellow at TNO-ESI. Her applied research encompassed complex multidisciplinary modeling projects at various high-tech companies, such as ASML, Thermo Fisher, Vanderlande and Canon Production Printing.


Dirk-Jan Swagerman

Better together: how system and software engineers can cooperate to innovate faster

14:30-15:10

The relationship between system and software engineering is often obscured by misconceptions and false dichotomies, such as the notion that Agile and waterfall methodologies are mutually exclusive. The introduction of Agile development brought adaptability, iterative progress and rapid response to change, but sometimes at the cost of poor engineering practices and higher maintenance costs. In this talk, Dirk-Jan Swagerman will address how best practices from both systems and software engineering enable the development of customer-centric products with cost-effective maintenance. Don't miss this opportunity to learn how to integrate the strengths of both disciplines and innovate faster and more effectively.

Having seen the full spectrum of system and software development, from embedded software in complex systems to web applications and image processing involving AI algorithms, Dirk-Jan Swagerman understands that software is both an asset and a debt. As an independent consultant, he helps businesses transform legacy code and improve agility, and he trains teams in systems and software architecture and integration.


Jelena Marincic (TNO-ESI)

How to manage scope when boundaries are expanding?

11:00-11:40

A system architect connects customer needs and business drivers with technical decisions. Recently, the scope of drivers has been expanding to social systems as well, touching on areas such as sustainability, data privacy and ethics. 

In this talk, I’ll focus on communication between the system architect and business stakeholders when forming a joint picture and long-term design strategy. When these experts collaborate, they see the world through the frameworks, models and techniques of their own disciplines. How do we know when to use which technique or framework? 

I’ll use the example of architecting for environmental sustainability and circularity of OEMs to explore these questions and show the first steps in answering them. 


Jelena Marincic is a senior research fellow with TNO-ESI. She conducts applied research in an industrial context in the domains of systems architecting and model-based systems engineering. Currently, she's the technical lead in an ESI industrial project on reference architecting for sustainability. Before joining TNO-ESI in 2019, she worked for six years as a model-based software design expert at Altran (now Capgemini Engineering). Her primary role was to support the introduction of model-based software techniques at ASML. Before that, she worked as a researcher and a software engineer. The common denominator of her career has been the topic of designing good-quality models that reflect the multidisciplinary nature of systems.


Bram Verhoef (Axelera AI)

Redefining development: AI’s generative solutions for tomorrow

14:30-15:10

Until recently, artificial intelligence (AI) primarily focused on classification and regression tasks, such as visual object identification and detection. These machine learning techniques excel at distinguishing between data distributions, making them invaluable assets in diverse fields like manufacturing, biomedicine, surveillance and retail. However, the true potential of AI lies not only in discerning data distributions but also in generating new data points from them. Enter generative AI. Generative AI empowers users to create imaginative images, text, audio, videos, code and more. These methods are poised to revolutionize our approach to development, automating mundane tasks and fostering rapid innovation. With the immense power of generative AI at your fingertips, the opportunities are limitless. Join us as we embark on a journey through the realm of generative AI, exploring cutting-edge algorithms and real-world examples of AI transforming industries while unlocking unprecedented levels of efficiency and creativity. 

Bram Verhoef has a background in statistics, psychology and neuroscience. After receiving his PhD in 2010 from KU Leuven, he conducted post-doctoral research at Harvard University and the University of Chicago, focusing on the computational neuroscience underlying attention mechanisms. In 2017, he returned to Belgium to work at Imec as a Principal Member of Technical Staff, leading the algorithm development related to a novel analog compute-in-memory deep learning chip. In 2021, he co-founded Axelera AI and is currently head of machine learning, leading the algorithm optimization efforts for Axelera AI’s state-of-the-art deep learning accelerator. 


Erik Meijer (Meta/Open Universiteit)

The Programmer’s Apprentice Season 2: Advancements and future directions in AI-assisted coding

15:15-15:55

In 1976, Charles Rich and his colleagues pioneered the concept of a programmer’s apprentice – an interactive programming system designed to assist expert programmers in the design, coding and maintenance of large and complex programs. The idea of a digital prosthesis, extending our biological brain to seamlessly bridge the gap between our ideas and the code we produce, has been revisited multiple times throughout the years. For instance, the Probability/Bigcode team has spent over half a decade applying machine learning to enhance efficiency for engineers, data scientists and systems across the board. 

Historically, these endeavors relied on traditional (symbolic) AI techniques or early-stage neural net models, resulting in limited success in meeting the high expectations. However, the emergence of generative models like GPT has transformed what was once considered science fiction into reality, rendering the need for custom model architectures and embeddings obsolete. Consequently, many developers are now incorporating AI-based ‘co-pilots’ into their daily programming routines. 

In this talk, we’ll explore the various ways AI has been applied at Meta, such as code search, code recommendations and bug fixing, and revisit these areas in the context of large language models (LLMs). Additionally, we’ll discuss our vision for the future of generative AI in productivity tools, pivoting from a collection of task-specific tools designed for a generic user base to a single, user-specific tool capable of tackling a multitude of tasks. This transition will illuminate the ongoing evolution of AI-assisted programming and its potential impact on the developer community and beyond. 

Erik Meijer is a Dutch computer scientist and entrepreneur. From 2000 to early 2013, he was a software architect for Microsoft, where he headed the Cloud Programmability Team. He then founded Applied Duality Incorporated. Since 2015, he's been a director of engineering at Facebook. He received his PhD from Nijmegen University in 1992. His research has included functional programming (particularly Haskell), compiler implementation, parsing, programming language design, XML and foreign function interfaces. Recently, he joined the Open Universiteit.

Yolanda van Dinther (Thermo Fisher Scientific)

Applied data science in electron microscopy: why data is the new oil?

11:00-11:40

Thermo Fisher Scientific is the world leader in serving science, making the world healthier, cleaner and safer. The Electron Microscopy division provides instruments that can be used to see nanoparticles and atoms. Both in research labs and in industrial environments, our customers use these microscopes to go from questions to usable information. In this process, data sets as well as smart software solutions are essential. In this talk, the value of data, both scientific data and data produced in and around the instruments, will be explained and demonstrated, using examples of applied data science. 

Yolanda van Dinther is Director Software Development in the Electron Microscopy division of Thermo Fisher Scientific, leading the digital transformation with a focus on unlocking and using the power of big and large data as well as finding robust ways to update and upgrade the systems, software and AI models. Before joining Thermo Fisher Scientific five years ago, Yolanda worked in various innovation leadership roles, ranging from software to system development across a variety of high-tech companies in the healthcare, semiconductors, consumer, automotive, life sciences and printing industries. She holds an MSc degree in computer science from Eindhoven University of Technology and a post-bachelor’s degree in counseling psychology. 


Petra Heck (Fontys)

A quality model for trustworthy AI systems: data, models and software

11:45-12:25

Building production-ready AI systems entails much more than just training a high-performance machine learning model. A production-ready AI system needs to be trustworthy, i.e. of high quality. But how to determine this in practice? For traditional software, ISO 25000 and its predecessors have long been used to define and measure software product quality characteristics. This talk introduces a quality model for AI systems, based on ISO 25000. This quality model defines nine characteristics of a trustworthy AI system that aren't represented in the software product quality characteristics of ISO 25000. To illustrate our quality model, we apply it to a real-life case study: a deep learning platform for monitoring wildflowers. We show how the quality model can be used as a structured dictionary to define quality requirements for data, models and software alike. Ongoing work is to extend the quality model with metrics, tools and best practices to build a quality toolbox for engineering trustworthy AI systems.

Petra Heck studied computer science at Eindhoven University of Technology and holds a PhD from Delft University of Technology (Quality of Just-in-Time Requirements). She's been working in ICT since 2002 and as a lecturer in software engineering at Fontys University of Applied Sciences since 2012. In 2019, she started postdoc research on the topic of AI engineering: how to build production-ready machine learning systems? Since 2022, she's been part of the new Fontys-wide Kenniscentrum Applied AI for Society as a senior researcher in AI engineering, involved in applied AI projects such as "Van sensoren naar zorg," together with Fontys teachers and students.


Wim van Broekhoven (Malvern Panalytical)

Addressing product verification: when large language models and SAFe don’t feel so ‘safe’

14:30-15:10

No process can assure product success. This talk isn’t about advocating for the ‘best’ process. Instead, it focuses on what a process brings, including automatic product verification. This exploration is based on the tried-and-tested methods within Malvern Panalytical, our journey to this point and the ongoing journey ahead. 

Commercial processes are often presented as the magic formula for guaranteed success, accompanied by an offer of consultants for support. However, I firmly believe in the statement “If you don’t consider waterfall, you’re not truly agile.” A process isn’t merely a sequence of steps, roles, responsibilities and ceremonies. It encompasses fundamental principles, clear communication, disciplined practice, customization and the freedom to operate within established guidelines. 

A process should lay the foundation for both reuse and automatic product verification. In particular, product verification needs to advance in response to the projected 20-55 percent increase in productivity from the utilization of tools that employ large language models. 

Wim van Broekhoven has over three decades of experience in technical automation. His professional background is in software engineering and product verification, having worked with diverse organizations across various countries. He’s demonstrated leadership skills, adeptly coaching international teams comprised of individuals from varied native cultural backgrounds. 

Wim has played different roles within software engineering, ranging from hands-on programmer to verification expert, project manager and discipline leader. This diversity of roles has given him a pragmatic perspective on development processes, software reuse, integration, quality control and collaboration. 

Currently, Wim is actively involved in the integration of cutting-edge software technologies at Malvern Panalytical, a developer of equipment and services for material characterization. These technologies include tools that leverage the potential of large language models. 

Throughout his career, Wim has witnessed the emergence of numerous software processes. In response to these changes, he’s expertly guided his team in evaluating and integrating these novel concepts based on their relevance and practicality. This thoughtful process has resulted in an engineering process currently used within Malvern Panalytical that emphasizes the importance of a uniform automatic product verification mechanism as a crucial aspect in the development of complex technical products. 


Gwen Calluy (Alten)

Cracks in quality

15:15-15:55

I've been solving quality-related problems for many different companies for the past ten years. Although the difficulties are often similar, the solutions are always custom made. However, there's one common thread: all requests for help arrived so late that the product was already significantly delayed, development was slowing down and customers were losing confidence. The later in the project a problem is solved, the more expensive and exhausting it becomes for the development team to deal with. 

In this talk, I’m going to show you how, at management level, you can spot those initial cracks in the quality and how those, when unattended, will make you lose control over the timeline of your project due to large-scale debugging, unexpected rework and large, repeated test cycles. By gathering just a few metrics, we can spot the majority of defects when they occur and are still easy to fix. Although the main point of my talk is to give you tools to spot the cracks in the quality, I intend to also leave you with inspiration for your own, unique solutions. 

After completing her geology studies, Gwen Calluy applied her knowledge of modeling as a scientific software engineer at Alten. Over time, she became fascinated with the concept of quality and how it could be tested. In the following years, she built several test teams from scratch, started teaching ISTQB, coached customers and consultants, and assessed many companies to help them improve the quality of their products and processes. In her current role as a technical manager, she brings her experience to a wide range of topics, but always with a long-term quality strategy in mind. 


Ana-Lucia Varbanescu (University of Twente)

Toward zero-waste computing

15:15-15:55

‘Computation’ has become a massive part of our daily lives; even more so in science, where many experiments and analyses rely on massive computation. Under the assumption that computation is cheap and time-to-result is the only relevant performance metric, we currently use computational resources at record-low efficiency. In this talk, I argue that this approach is an unacceptable waste of computing resources and propose ways to estimate this waste. I'll provide insights and ideas for quantifying computing waste in terms of lack of efficiency, and I'll introduce performance engineering methods and techniques to reduce waste in computing. By means of a couple of case studies, I'll also demonstrate performance engineering at work, proving how efficiency and time-to-result can be happily married. Finally, I'll propose a strategy for system co-design to enable zero-waste computing by construction.
Ana-Lucia Varbanescu holds BSc and MSc degrees from Politehnica University in Bucharest, Romania. She obtained her PhD from Delft University of Technology and continued to work as a postdoc researcher in the Netherlands, at TU Delft and VU University Amsterdam. She's a MacGillavry fellow at the University of Amsterdam, where she was tenured in 2018 as associate professor. Since 2022, she's been a professor at the University of Twente. She's been a visiting researcher at IBM TJ Watson, Barcelona Supercomputing Center, Nvidia and Imperial College London. Her research stems from HPC and investigates the use of heterogeneous architectures for high-performance computing, with a special focus on performance and energy efficiency modeling for both scientific and irregular, data-intensive applications. Her latest research focuses on zero-waste computing and systems co-design.

Kris van Rens (High Tech Institute)

Safety and security in systems programming

11:00-11:40

As the world grows into a more connected system, the need for strong safety and security requirements in systems programming increases. A language like Rust offers a direct advantage here, as it guarantees memory safety, even in concurrent contexts. But there's more to systems programming safety and security than memory safety alone. There's also type safety, initialization safety and numerical safety, just to name a few. Due to this increased relevance, as well as the rising popularity of languages like Rust, existing languages have to adapt. What categories of safety and security are there? How are relatively old languages like C++ being updated to accommodate safety? And are they accommodating it at all? Find out as we discuss some of the many interesting aspects of safe and secure systems programming. I'll demonstrate and compare the various options available in common languages used for high-performance systems programming, like C, C++ and Rust. 
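A few of the safety categories the abstract names can be made concrete with a small Rust sketch. This is not material from the talk, just a minimal illustration of numerical, bounds and memory safety:

```rust
fn main() {
    // Numerical safety: checked arithmetic returns None on overflow
    // instead of silently wrapping or invoking undefined behavior.
    let big: u8 = 200;
    assert_eq!(big.checked_add(100), None);
    assert_eq!(big.checked_add(55), Some(255));

    // Bounds safety: `get` returns an Option instead of reading
    // out of bounds.
    let xs = [1, 2, 3];
    assert_eq!(xs.get(10), None);
    assert_eq!(xs.get(1), Some(&2));

    // Memory safety: the borrow checker rejects use-after-move at
    // compile time -- uncommenting the line below would not compile.
    let s = String::from("hello");
    let t = s; // ownership moves to `t`
    // println!("{}", s); // error[E0382]: borrow of moved value: `s`
    assert_eq!(t.len(), 5);
}
```

In C or C++, the equivalents of all three situations compile silently and fail (or corrupt memory) at runtime, which is the contrast the talk explores.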

Ever since the first time Kris van Rens got in touch with his dad's 1983 ZX Spectrum, he was captivated by the wonderful world of computer programming. In 1995, he learned to program Pacman in x86 real-time assembly, which was soon followed by learning C and then C++ and Rust, which came to be his bread and butter. He's very serious about code quality and is mostly interested in C++, Rust, Linux, programming languages/paradigms, software architecture and performance optimization. He currently works as the lead developer at Vinotion and as a trainer at High Tech Institute in Eindhoven, the Netherlands. Kris studied mechatronics at FH Niederrhein in Krefeld, Germany, and electrical engineering at Eindhoven University of Technology. 


Richard Doornbos (TNO-ESI)

AI, Ai, ai! From digital twin to optimized system

14:30-15:10

For many cyber-physical systems (CPSs), it's essential that they're optimized during operation, but optimizing them is very challenging. Since CPSs are internally complex and hard to understand, this often requires highly qualified and scarce expert operators. Artificial intelligence (AI) and in particular machine learning could play an important role in reducing the cost and effort of CPS optimization. 

These CPSs also have limited availability and are sometimes slow and expensive to operate, making it impractical to train AI directly on the systems. This problem could be overcome with digital twin (DT) technology. 

The Asimov ITEA project investigates the creation of digital twins of CPSs to train reinforcement learning agents (a type of ML) that can then be used to optimize the CPS. We investigate the following questions: can AI be used to optimize complex cyber-physical systems? What's the minimum viable DT to provide a realistic basis for training the AI? What are ways to train an AI on a DT and then apply it to real systems? How to engineer systems that integrate AI components? Pillars of our approach include a) prototyping in two industrial cases; b) exploratory research using experimental setups; c) analysis of architectural challenges; d) initial investigations into productization. 

We’ll present the industrial cases, key technology challenges, trade-offs and early optimization results. 

Richard Doornbos is a senior research fellow with TNO-ESI. He conducts applied research on systems architecting methods for complex high-tech systems. He currently investigates how to apply model-based systems architecting and systems engineering techniques in multidisciplinary architecting teams, with a focus on reference architectures, system modeling and aspects such as architecting skills, tool support and team cooperation. 


Lina María Ochoa Venegas (Eindhoven University of Technology)

To evolve or not to evolve? The library-client co-evolution dilemma

14:30-15:10

The dilemma: software projects acting as libraries can either evolve their code at the potential cost of breaking their client projects or add subtle patches to maintain healthy clients at the cost of increasing technical debt. The problem: the conservatism that surrounds the unknown about the consequences of software evolution hinders the phenomenon itself. Depending on their values and priorities, developers can opt for conservative approaches such as developing libraries themselves to avoid uncontrolled dependencies or even avoid the introduction of breaking changes at all to maintain healthy clients. Previous research has shown that software evolution and, in particular, breaking changes seldom impact clients (less than 8 percent when it comes to Java projects in the Maven ecosystem). New tooling and enhanced development workflows are needed to keep both sides informed on the uses of software libraries and the impact evolution has on them. This talk shows empirical evidence that elucidates the magnitude of the problem and new research directions where open source and industry can meet. 
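The dilemma can be sketched in a few lines of a hypothetical library (all names invented for illustration): the library evolves its API, but keeps the old entry point as a deprecated shim so existing clients don't break, at the cost of carrying extra code (technical debt):

```rust
// Hypothetical library module evolving its API without breaking clients.
mod geometry {
    // v1 API, kept as a thin shim; the deprecation attribute nudges
    // clients toward the new name without breaking their builds.
    #[deprecated(note = "use `area_of_rect` instead")]
    pub fn area(w: u32, h: u32) -> u32 {
        area_of_rect(w, h)
    }

    // v2 API: the evolved implementation lives here. Deleting `area`
    // above would be the breaking alternative.
    pub fn area_of_rect(width: u32, height: u32) -> u32 {
        width * height
    }
}

fn main() {
    // Old clients keep compiling (with a deprecation warning)...
    #[allow(deprecated)]
    let old = geometry::area(3, 4);
    // ...and agree with the new entry point.
    assert_eq!(old, geometry::area_of_rect(3, 4));
    assert_eq!(old, 12);
}
```

Every such shim is a small piece of the technical debt the talk weighs against the cost of breaking clients outright.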

Lina María Ochoa Venegas moved to Amsterdam in 2017 to start as a PhD candidate on the European Union project Crossminer. During this time, she was affiliated with Centrum Wiskunde & Informatica (CWI) in Amsterdam, the Netherlands. After the completion of the project, she moved to Eindhoven to work as a PhD candidate at Eindhoven University of Technology (TUE). She finalized her PhD studies in 2022 under the supervision of Thomas Degueule, Jurgen Vinju and Mark van den Brand. Currently, she's an assistant professor in the Software Engineering and Technology (SET) cluster at TUE, working on software evolution and software analysis topics. 


Rosilde Corvino (TNO-ESI)

Massive code analysis and refactoring: will it be for everyone?

15:15-15:55

Custom static analysis techniques have proven incredibly useful for code restructuring and analysis in software development. These techniques involve using automated tools to detect and repair coding issues, such as deviations from architecture specifications, and to rejuvenate old code patterns and outdated library usage. Companies like Philips and Thermo Fisher Scientific have reported successful results from implementing these techniques, including identifying critical areas in software, reducing technical debt, improving software modularity and decreasing testing and maintenance costs. 

One of the main challenges with custom static analysis techniques is that they require a high level of technical expertise to be applied effectively. TNO-ESI researchers are currently working on democratizing these techniques by developing user-friendly methods and tools that don’t require extensive parser knowledge. The techniques are designed to provide an accessible and low-cost solution, allowing all developers to benefit from the advantages of custom static analysis. They can potentially change software development, enabling more people to analyze and maintain software with less effort and greater efficiency. 
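As a rough illustration of the detect-and-report idea, here is a toy analysis in Rust. Real custom static analysis, as described in the abstract, works on parse trees rather than raw text, and the function and API names below are invented:

```rust
// Toy "static analysis": flag source lines that use a deprecated API.
// A real tool would parse the code and match on the syntax tree.
fn find_deprecated_calls(source: &str, deprecated: &str) -> Vec<usize> {
    source
        .lines()
        .enumerate()
        .filter(|(_, line)| line.contains(deprecated))
        .map(|(i, _)| i + 1) // report 1-based line numbers
        .collect()
}

fn main() {
    let code = "\
fn render() {
    legacy_draw(); // old library usage
    modern_draw();
}";
    // The hypothetical `legacy_draw` API is flagged on line 2.
    assert_eq!(find_deprecated_calls(code, "legacy_draw"), vec![2]);
}
```

The "democratization" goal in the talk is precisely to let developers express checks like this, and the corresponding repairs, without having to build or understand a parser themselves.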

Rosilde Corvino is an IT professional with experience in embedded-software development and project management. She holds an MSc in electronic engineering from the Politecnico di Torino and a PhD in micro and nanoelectronics from the Université Joseph Fourier in Grenoble. Throughout her career, Rosilde has held technical positions in processor design, compiler design and software development, working for organizations such as INPG and Inria in France and TUE, Intel and TNO-ESI in the Netherlands. She has expertise in and has worked on projects using several programming languages, model-based design, domain-specific languages and static analysis. At TNO-ESI, she's leading research activities to define effective and efficient approaches to deal with software legacy. 


Gabriele Keller (Utrecht University)

High-performance computing at your fingertips – making parallelism accessible using embedded languages

11:00-11:40

Not long ago, high-performance computing (HPC) required expensive specialized hardware. Now, laptops and desktops routinely have 8-64 CPU cores and come with GPUs whose computing power exceeds that of dedicated HPC hardware from just a few years ago. Despite the availability of parallel hardware, it's still not easy for application programmers to exploit these capabilities, as low-level knowledge is required. Parallel libraries, such as BLAS, are an important step to ease the burden on programmers, but they're limited to particular applications, and optimization across library functions is limited. Dedicated parallel languages can overcome these limitations, but programmers have to learn a new language. This talk discusses a third approach: parallel languages embedded in an existing general-purpose language, which aim to provide the best of both. To illustrate the approach with a concrete example, we'll look at the architecture of Accelerate, an embedded DSL for high-performance array programming of multicore CPUs and GPUs.

Gabriele Keller is the head of the software engineering division and the chair of the software technology group at Utrecht University. Her research interests include programming language technology, in particular for functional and embedded languages, software verification and high-performance computing. She received her PhD from the Technical University of Berlin in 1999 for her work on the implementation of irregular data parallelism. She worked at the University of New South Wales in Australia, where she co-founded the programming language group, and moved to Utrecht University in 2018.


Ben van Werkhoven (Leiden University)

Automatically optimizing GPU applications for performance and energy efficiency

11:45-12:25

GPUs are fueling the current AI revolution and have dramatically reshaped the computing landscape over the past two decades. Serving as the primary driving force in training large language models, such as ChatGPT, as well as many large-scale scientific applications, GPUs have become the go-to platform for high-throughput and energy-constrained computing. However, unlocking the full computational power of the GPU is a significant challenge. Reducing the carbon footprint and optimizing the performance of GPU applications require exploration of vast and discontinuous program design spaces, which is an infeasible task for programmers to perform manually. Moreover, this search process needs to be repeated for different hardware and for various input problems, leading to productivity and maintainability challenges. This talk provides an overview of our research into Kernel Tuner, a software development tool that enables GPU programmers to create GPU applications with optimal compute and energy performance. 
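
The core auto-tuning loop can be illustrated without a GPU. The sketch below is not Kernel Tuner's API but a toy brute-force tuner over a single hypothetical parameter, timing every configuration and keeping the fastest; real tuners search far larger, discontinuous spaces with smarter strategies than exhaustive enumeration.

```python
import itertools
import timeit

def chunked_sum(data, chunk_size):
    """The 'kernel' under test: sum a list in chunks of a tunable size."""
    total = 0
    for i in range(0, len(data), chunk_size):
        total += sum(data[i:i + chunk_size])
    return total

def tune(data, tune_params):
    """Benchmark every point in the parameter space; return the fastest
    configuration plus all measurements."""
    results = []
    for config in itertools.product(*tune_params.values()):
        kwargs = dict(zip(tune_params.keys(), config))
        t = timeit.timeit(lambda: chunked_sum(data, **kwargs), number=5)
        results.append((t, kwargs))
    _, best_config = min(results, key=lambda r: r[0])
    return best_config, results

data = list(range(100_000))
best, all_results = tune(data, {"chunk_size": [64, 256, 1024, 4096]})
print("fastest configuration:", best)
```

Because the best configuration depends on the hardware and the input size, this search has to be repeated per machine and per problem, which is precisely the productivity burden that automated tuning tools aim to remove.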

Ben van Werkhoven is assistant professor at Leiden University, where he conducts research in automatic software optimization for high-performance computing. After obtaining his PhD at VU Amsterdam in 2014, he joined the Netherlands eScience Center, where he was accelerating science using GPUs as part of many collaborative projects in various disciplines. Ben is the principal investigator of Kernel Tuner, a growing ecosystem of tools for testing and auto-tuning GPU applications. He’s leading work packages in several large consortia, including NWO (NWA-ORC), EU Horizon 2020 and EuroHPC JU. He’s also a co-founder of the Netherlands Research Software Engineers community (NL-RSE). 


Magiel Bruntink (SIG) 

LETSGO build green software!

11:00-11:40

How can software engineers build software that consumes fewer resources? Can they also reduce the resource footprint of their work and their automated tools? How can organizations measure and improve their sustainability? SIG is working on these questions together with our partners in the LETSGO project (Schuberg Philis, VU Amsterdam and others). In this talk, I'll outline our current progress on code-level energy benchmarking and automated refactoring, and then provide an update on our plans with LETSGO for hardware-validated models for green software.
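
As a rough illustration of code-level benchmarking (not the LETSGO methodology itself), the sketch below times two equivalent implementations, using CPU time as a crude proxy for energy use; hardware-validated models, as pursued in LETSGO, would calibrate such measurements against actual power readings instead.

```python
import timeit

# Two functionally equivalent implementations of string concatenation.
# Shorter runtime is used here as a stand-in for lower energy use; real
# energy benchmarking validates this against hardware power measurement.

def concat_with_plus(words):
    out = ""
    for w in words:
        out += w
    return out

def concat_with_join(words):
    return "".join(words)

words = ["x"] * 10_000
t_plus = timeit.timeit(lambda: concat_with_plus(words), number=200)
t_join = timeit.timeit(lambda: concat_with_join(words), number=200)

# Both must compute the same result before comparing their cost.
assert concat_with_plus(words) == concat_with_join(words)
print(f"+= loop: {t_plus:.3f}s   join: {t_join:.3f}s")
```

An automated refactoring tool built on this idea would propose the cheaper variant only when the benchmark shows a consistent saving and the results are provably equivalent.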

Magiel Bruntink is head of research at Software Improvement Group (SIG). At SIG, he researches new integrated models of software quality: models that allow reasoning about the quality of software products, of development processes and of the people doing the work. Magiel is an internationally published author in the field of software engineering, with 20 years of experience in research, consulting and education. He holds a PhD in computer science from Delft University of Technology.


Keynote: Carlijn Compen (Canon Production Printing)

The human touch: getting the hard stuff right in software development

13:45-14:25

Canon is known as one of the world’s most innovative companies. It’s primarily known for its hardware, such as cameras and printers, but this hardware can’t exist without software. At Canon Production Printing, we’re focused on accelerating digital imaging technologies and developing high-tech printing products for professional customers, ranging from creative studios around the corner to blue-chip multinationals around the globe. We believe that software adds the behavior to our products: it ties together complex functionality and makes it accessible to our customers. Moreover, software plays a vital role during product development, as it enables, for example, model-based development for multidisciplinary teams. In this talk, we’ll explore how we organize software development within CPP, creating optimal value for our customers as well as for our internal organization.

Carlijn Compen joined Canon Production Printing in 2007 after obtaining a Master’s degree in industrial design at Eindhoven University of Technology. In her various roles as UX designer (in the Netherlands and Japan), portfolio director and head of design, she specialized in user-centered design and development in a high-tech R&D environment. Since 2022, she has been Vice President of Software Development, leading and defining the strategic direction of the software development discipline at CPP. She focuses on strengthening the customer and user perspective in software development and in the CPP R&D organization.

Karoline Logman & Dennis Postma (Thales)

Where waterfall meets Agile

11:45-12:25

When a customer traditionally works in a very waterfall way and the software department is Agile, there's a need for compromise. Which choices were made and why? In this talk, we present a top-down breakdown and a bottom-up "what is it like in practice" story.

Karoline Logman started her IT career at CMG Arnhem and Enschede, where she was a tester and requirements engineer for eight years. After emigrating to New Zealand, she was a QA manager for Softtech. Moving back to the Netherlands, she continued working for Softtech as product owner and project manager for a French project. When this project ended, she became operational manager and R&D manager for three years, introducing the Agile Scrum process and guiding the Phoenix software team to a more mature organization under the Synopsys umbrella. Currently, she’s a product owner for the Tacticos software product from Thales.

Dennis Postma started working for Thales in 2018 as a Scrum master at the IVVQ department. Prior to that, he worked as a Scrum master and Agile coach for about eight years at a software company in Enschede. At Thales, he performed combat system integration for the Mexico program on site in Salina Cruz, Mexico. After returning, he was asked to become the chief Scrum master for the development of the next version of the Tacticos software. He overcame several challenges in organizing a good multi-site, multi-country development environment in which the teams follow correct processes and practices and use a shared set of tools.


Keynote: Arie van Deursen (TU Delft)

Explainable software engineering

Rethinking the full SE life cycle

16:30-17:10

In artificial intelligence (AI), it’s increasingly recognized that components that learn from data need to be explainable. In this talk, we take explainability one step further, using it as a lens to rethink the full software engineering life cycle. To that end, we consider explainability of both the software engineering process and the resulting software system. We use this to shed new light on requirements traceability, delay prediction, code review and AI-powered coding. Furthermore, we revisit software testing, interpreting test cases as executable explanations at different levels of granularity. Based on this, we envision a future of software engineering in which explainability is a first-class citizen.
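
The idea of test cases as executable explanations can be made concrete with a small, hypothetical example (not from the talk itself): each test documents, at its own level of granularity, what the function under test promises, and running the tests checks that the explanation still holds.

```python
def normalize_whitespace(text: str) -> str:
    """Collapse runs of whitespace into single spaces and trim the ends."""
    return " ".join(text.split())

def test_explains_collapsing():
    # Coarse-grained explanation: any run of whitespace becomes one space.
    assert normalize_whitespace("a \t b\n\nc") == "a b c"

def test_explains_edge_cases():
    # Fine-grained explanation: leading/trailing whitespace is removed,
    # and the empty string is a fixed point.
    assert normalize_whitespace("  hi  ") == "hi"
    assert normalize_whitespace("") == ""

test_explains_collapsing()
test_explains_edge_cases()
print("all explanations hold")
```

Read this way, a failing test is not just a red mark but a stale explanation: either the code or the documented promise has to change.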
Arie van Deursen is a professor in software engineering at Delft University of Technology, where he’s also head of the Software Technology department. He holds an MSc degree from the Vrije Universiteit Amsterdam (1990) and a PhD from the University of Amsterdam (1994). His research interests include software testing, language models for code, human aspects of software engineering and trustworthy artificial intelligence. He’s scientific director of the Delft Fintech Lab and co-PI of the NWO Long Term Program Robust (2022-2032) on trustworthy AI. Based on his research, he co-founded the Software Improvement Group (2000) and PerfectXL (2014). For the Dutch government, he serves on the Advisory Council for ICT Assessments (AcICT).

Keynote: Philippa Hopcroft (Cocotec) & Ivo ter Horst (ASML)

Test less, verify Moore

Transforming how lithography machine software is built

9:45-10:25

ASML’s lithography machines are complex cyber-physical systems of systems, designed to be extremely accurate, deliver very high throughput and operate 24/7 with exceptionally reliable results. More than 10 years ago, ASML embarked on a formal verification journey to transform its software development practices, with the aim of improving software quality and speeding up delivery. Unlike testing, formal verification explores all possible scenarios and can automatically detect the hard-to-find bugs that elude testing. With the right tools, entire classes of errors can be eliminated early and fast.
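
The contrast with testing can be shown on a toy model. The sketch below is not the Coco Platform but a minimal breadth-first state-space exploration in Python: it visits every reachable state of a naive two-process counter (a hypothetical model) and finds the classic lost-update bug that a finite test suite can easily miss.

```python
from collections import deque

def step(state):
    """Yield all successor states: each process first loads the counter
    into a local register, then stores register + 1 back (no locking)."""
    counter, pcs, regs = state
    for p in (0, 1):
        if pcs[p] == "load":
            new_regs = list(regs); new_regs[p] = counter
            new_pcs = list(pcs); new_pcs[p] = "store"
            yield (counter, tuple(new_pcs), tuple(new_regs))
        elif pcs[p] == "store":
            new_pcs = list(pcs); new_pcs[p] = "done"
            yield (regs[p] + 1, tuple(new_pcs), regs)

def check(initial, invariant):
    """Explore every reachable state breadth-first; return a violating
    state, or None if the invariant holds everywhere."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        state = frontier.popleft()
        if not invariant(state):
            return state
        for nxt in step(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return None

initial = (0, ("load", "load"), (None, None))
# Invariant: once both processes are done, the counter must equal 2.
no_lost_update = lambda s: s[1] != ("done", "done") or s[0] == 2
violation = check(initial, no_lost_update)
print("lost-update bug found:", violation is not None)
```

Because the exploration is exhaustive, the interleaving where both processes read the counter before either writes it back is guaranteed to be examined; a sampled test run might never hit it. Industrial formal verification applies the same principle to models far too large to enumerate naively.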

During this journey, ASML engineers introduced the Coco Platform as the next-generation replacement, able to tackle the scale and complexity of the systems required. The Coco Platform offers a number of unique strengths, which have contributed to ASML’s successful leap forward along its formal verification journey. Today, millions of lines of code generated from formally verified models make ASML systems run reliably.

In this joint ASML-Cocotec talk, we will introduce the Coco Platform and highlight some of the challenges faced when introducing formal verification on an industrial scale. We will discuss our experiences of migrating thousands of first-generation models to Coco and preparing hundreds of engineers for this new future.

Philippa Hopcroft is one of the co-founders and CEO of Cocotec, a spinout of the University of Oxford with the mission of bringing formal-verification tools to developers across industries. She completed her DPhil at Oxford in 2001 on security protocol analysis and has been pushing the boundaries of what’s possible with formal verification in industry and academia ever since. Prior to launching Cocotec in 2019, she was a Senior Research Fellow at the University of Oxford, where she led several research projects and was instrumental in getting strong engagement from companies across sectors, resulting in the successful spinout of Cocotec.
Ivo ter Horst joined ASML in 2007 and has worked on the software of various parts of its lithography systems. He holds a Master’s degree in computer science from the University of Twente, where he specialized in software engineering and formal methods. For the last 10 years, he has been leading the Formal Software Engineering technical competence at ASML, where he guides engineering teams and focuses on making formal-verification technology accessible to ASML’s large software engineering community.