INFO-H-508 : Questions Actuelles d'Informatique : Planning 2011 : Details

8/02/2011 Data-warehousing of public genomic datasets

Abstract

There are ca. 500,000 genomic profiles freely available in the public domain. While large compendia of public datasets exist, they do not allow for easy re-analysis. For example, compendia can be proprietary, or in a format unusable by a given analysis software platform. The information about biological samples is unstructured or, when it is structured, limited to predefined, incomplete ontologies. In practice, the datasets are often in a raw form and require tedious and error-prone retrieval and compilation before they can be used in visualization and analysis software platforms. InSilico DB is a web-based software platform that provides a complete solution for browsing and exporting expert-curated genome-wide datasets into the analysis and visualization platform GenePattern.

Speaker

David Weiss studied bioengineering and recently completed a PhD in cancer bioinformatics. He is now in charge of a project to create a spin-off company providing genomic dataset management solutions to the biopharmaceutical industry.

Slides

15/02/2011 The Intel ExaScience Project: High-Performance Computing in the Next Decade

Abstract

The talk will present the Intel ExaScience Lab and give a brief overview of its four work packages (space-weather simulation, numerical toolkit and runtime layer, architectural simulation, and visualization). The focus of the talk will be on the runtime layer, presenting an automatic load-balancing framework, used in a prototype exascale simulation application, to counter runtime hardware variability.

Speaker

Roel Wuyts is a member of the Intel ExaScience Lab in Leuven, Belgium. The main goal of the Intel ExaScience Lab in Flanders is to investigate the software engineering side of exascale systems. To this end, it was decided to build a complete exascale space-weather simulation application as the driver for the lab's research. The results will be used to drive Intel's hardware development for exascale supercomputers.

Within this project Roel works on the runtime management layer that sits between the actual hardware and the application. A first challenge is to find a programming and execution model that can handle hundreds of thousands of threads executed in a distributed setting where the data needs to be partitioned. A second challenge is that the software needs to be able to cope with hardware unreliability issues (surviving hardware failures during the program execution) as well as uncontrolled hardware variability (hardware slowdowns at runtime), without much impact on overall performance.
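
To make the load-balancing idea concrete, here is a minimal, purely illustrative Python sketch (not the ExaScience runtime itself): work for the next phase is redistributed in proportion to the throughput each worker actually achieved, so that a node suffering a hardware slowdown automatically receives less work.

<code python>
# Hypothetical illustration of throughput-based rebalancing; not the
# ExaScience runtime, just the underlying idea of adapting to slow nodes.
import random

def run_phase(assignment, nominal_speed=100.0):
    """Simulate one phase; return the items-per-second each worker achieved."""
    return {worker: nominal_speed * random.uniform(0.6, 1.0)   # runtime variability
            for worker in assignment}

def rebalance(total_items, throughput):
    """Split the next phase's work proportionally to the observed throughput."""
    total = sum(throughput.values())
    return {worker: round(total_items * speed / total)
            for worker, speed in throughput.items()}

assignment = {w: 250 for w in range(4)}   # start with a uniform split of 1000 items
throughput = run_phase(assignment)
print(rebalance(1000, throughput))        # slower workers receive fewer items
</code>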

Slides

22/02/2011 Participatory noise mapping

Abstract

Within the brand-new BrusSense research project, we are investigating how participatory sensing can be used to manage pollution in our common environment, the atmosphere. Participatory sensing appropriates everyday mobile devices such as cellular phones to form interactive, participatory wireless sensor networks that enable public and professional users to gather, analyse and share local knowledge. In this talk I will explain how to develop and apply this technique for urban noise mapping. Creating pollution maps through participatory sensing constitutes a new, scalable approach, almost orthogonal to the current governmental and EU-regulated techniques for pollution mapping, which rely on simulation models. In our setting, sound levels are measured on mobile phones through the NoiseTube application, which also takes care of the geo-temporal tagging of the data. Measurements are gathered and visualised on the project's website, which interfaces with Google Maps to represent the paths that users walked, colour-coded according to the measured sound levels. While NoiseTube has been used for some time by individual users, it is unclear how measuring campaigns should be set up, how this collectively gathered data can be distilled into a single noise map, and what the quality of these maps would be. I will explain the experiments we set up in Antwerp and Brussels to investigate these issues and discuss our preliminary findings. To our knowledge this is the first experiment of its kind (a single street segment has been mapped before, whereas we map areas of up to 1 km²).
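
As a hedged illustration of the aggregation step described above (the grid size and function names are assumptions, not NoiseTube's actual pipeline), geo-tagged decibel readings can be binned into a regular grid and averaged in the energy domain, as sound levels require:

<code python>
# Illustrative only: bin geo-tagged dB(A) readings into a grid-cell noise map.
import math
from collections import defaultdict

def noise_map(measurements, cell_size=0.001):
    """measurements: iterable of (latitude, longitude, level_db).
    Returns {(cell_x, cell_y): average level in dB}, averaging in the
    linear energy domain rather than arithmetically."""
    cells = defaultdict(list)
    for lat, lon, level in measurements:
        key = (int(lat / cell_size), int(lon / cell_size))
        cells[key].append(level)
    return {key: 10 * math.log10(sum(10 ** (l / 10) for l in levels) / len(levels))
            for key, levels in cells.items()}

readings = [(50.8466, 4.3528, 62.0), (50.8467, 4.3529, 70.0), (50.8503, 4.3517, 55.0)]
print(noise_map(readings))
</code>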

Speaker

Dr. Ellie D’Hondt is a post-doctoral researcher at the VUB’s Computer Science department. She holds degrees in Physics, Computer Science and Mathematics from the VUB and has a keen interest in multidisciplinary research. After a chapter in quantum computing she has recently decided to align her research with the worldwide sustainability effort, in particular focussing on participatory sensing techniques for mapping pollution in the environment.

Slides

1/03/2011 Programming massively parallel processors

Abstract

In recent years, the concept of GPGPU (General-Purpose computing on Graphics Processing Units) has gained increasing importance. This technology allows researchers and developers to exploit the computing power of the many processing units found in the latest generations of graphics cards. As a result, a large number of massively parallelizable algorithms have benefited from performance gains, as well as from better performance per dollar and per watt, compared with their CPU-only implementations. Over the past few years, a wide variety of industries and applications have therefore successfully taken advantage of this technology. Medical imaging, environmental science, computational fluid dynamics, finance and ray tracing are just a few examples of domains in which GPGPU has proven its worth.
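
As a hedged illustration of the GPGPU programming model (written here with the Python numba library rather than the CUDA C toolkit discussed in the talk, and assuming a CUDA-capable GPU is available), a data-parallel kernel assigns one thread to each array element:

<code python>
# Minimal data-parallel GPU kernel (SAXPY) via numba; assumes a CUDA-capable GPU.
import numpy as np
from numba import cuda

@cuda.jit
def saxpy(a, x, y, out):
    i = cuda.grid(1)          # global thread index
    if i < out.size:          # guard against out-of-range threads
        out[i] = a * x[i] + y[i]

n = 1_000_000
x = np.random.rand(n).astype(np.float32)
y = np.random.rand(n).astype(np.float32)
out = np.zeros_like(x)
threads_per_block = 256
blocks = (n + threads_per_block - 1) // threads_per_block
saxpy[blocks, threads_per_block](np.float32(2.0), x, y, out)   # launch configuration
</code>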

Speaker

Thibault Hubert obtained his licentiate degree in computer science from ULB in 2007. In 2008, he joined DS Improve, an IT services company based in Brussels. As an R&D developer, he works on various graphics engines as well as on the streaming of 3D environments. In this role he specializes in object-oriented programming, shader programming, GPGPU technologies, technical analysis and the supervision of students.

Slides

15/03/2011 Turning data into value: Predictive Analytics in Marketing

Abstract

In the current business landscape, an increasing number of companies use the information in their databases as a source of competitive advantage. Using historical databases, companies predict how to optimally interact with prospects, customers, and lost customers. In this session, we provide a current state of the domain, and we illustrate the lessons learned by means of a case study at a leading business-to-business retailer.

In this case study, we demonstrate how Predictive Analytics optimizes customer relationships. We show how replacing an existing segmentation scheme with accurate predictive response models can result in a 10% revenue increase. Additionally, we illustrate how companies succeed in delivering 300% more relevant communications by offering personalized promotional content to every single customer.
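
A minimal sketch of what such a predictive response model can look like (illustrative features and scikit-learn code, not Python Predictions' actual methodology): a classifier is trained on historical campaign responses and then used to rank customers by their predicted probability of responding.

<code python>
# Illustrative response model: rank customers by predicted probability of
# responding to a campaign (features and data are made up for the example).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Hypothetical historical data: recency, frequency, monetary value per customer.
X_hist = rng.normal(size=(500, 3))
y_hist = (X_hist[:, 1] + 0.5 * X_hist[:, 2] + rng.normal(size=500) > 0).astype(int)

model = LogisticRegression().fit(X_hist, y_hist)

X_prospects = rng.normal(size=(10, 3))
scores = model.predict_proba(X_prospects)[:, 1]   # P(response) per customer
ranking = np.argsort(scores)[::-1]                # contact the highest scores first
print(list(ranking), scores[ranking].round(2))
</code>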

Speaker

Dr. Geert Verstraeten is Partner at Python Predictions, a Belgian niche player with expertise in the domain of customer intelligence. He has 10 years of hands-on experience in industries such as retail, mail order, telecom, banking, utilities and subscription services. More specifically, his interests lie in delivering high-performing yet interpretable predictions of future individual customer behavior. In 2005, Geert obtained a PhD in Applied Economics at Ghent University, Belgium. His thesis is entitled “Issues in Predictive Modeling of Individual Customer Behavior: Applications in Targeted Marketing and Consumer Credit Scoring”.

Slides

22/03/2011 Dynamic Program Analysis for Database Reverse Engineering

Abstract

The documentation of a database includes its conceptual schema, which formalizes the semantics of the data, and its logical schema, which translates the former into an operational database model. Important processes such as database and program evolution must rely on accurate database documentation. In many cases, however, this documentation is missing or, at best, incomplete and outdated. Database redocumentation, a process also called database reverse engineering, typically involves the elicitation of implicit schema constructs. The most common technique for discovering these implicit constructs is the static analysis of application programs, in particular of embedded SQL statements. Unfortunately, the increasing use of dynamic SQL in modern application development often renders such techniques ineffective. This talk elaborates on the use of dynamic program analysis for database reverse engineering. It illustrates and compares possible techniques for (1) capturing SQL query execution traces at runtime and (2) extracting implicit schema constructs from those traces. It reports on a comparative experiment evaluating the usefulness of the presented techniques. This talk is based on joint work with Jean-Luc Hainaut and Jean-Roch Meurisse.
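
As a hedged illustration of step (2) above (a toy sketch, not the authors' tooling), a captured trace of executed SQL statements can be scanned for equality join conditions, which are good candidates for implicit foreign keys:

<code python>
# Toy illustration: propose candidate implicit foreign keys from join conditions
# found in a captured SQL execution trace (not the tool described in the talk).
import re
from collections import Counter

JOIN_COND = re.compile(r"(\w+)\.(\w+)\s*=\s*(\w+)\.(\w+)")

def candidate_foreign_keys(trace):
    """trace: iterable of executed SQL statements (strings)."""
    counts = Counter()
    for stmt in trace:
        for t1, c1, t2, c2 in JOIN_COND.findall(stmt):
            counts[((t1, c1), (t2, c2))] += 1
    return counts.most_common()

trace = [
    "SELECT * FROM orders o, customers c WHERE o.cust_id = c.id",
    "SELECT o.total FROM orders o JOIN customers c ON o.cust_id = c.id",
]
print(candidate_foreign_keys(trace))   # (o.cust_id, c.id) seen twice -> likely FK
</code>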

Speaker

Anthony Cleve is a post-doctoral researcher at the University of Namur (FUNDP) and a part-time lecturer in database engineering at the Université Libre de Bruxelles (ULB). Anthony's research interests include software maintenance and evolution, software and database reverse engineering, program analysis and transformation, software product line engineering, and self-adaptive and context-aware systems. He is co-author of more than twenty peer-reviewed publications in international venues in software and information systems engineering. He was the recipient of the IBM Belgium 2010 Award for the best PhD thesis in computer science and applications. Anthony serves or has served on the organizing and program committees of several international conferences and workshops, including ASE, CSMR, ER, ESEC/FSE, ICSM, IWPSE, SLE and WCRE. He is co-chair of the ERCIM Working Group on Software Evolution and a steering committee member of the International Workshop on Principles of Software Evolution (IWPSE).

Slides

29/03/2011 Geographic information in seventh heaven

Abstract

Born in the 1980s, digital mapping software long remained a field apart within IT. Until the beginning of the 21st century, these systems were often operated by teams independent of IT departments. The situation began to change towards the end of the 1990s with the appearance of the first web-based solutions. Over the last 10 years, several events have shaken up the sector: the liberalization of the GPS signal, the arrival of open-source solutions, and so on. The IT world, and the web world in particular, has moved into the field of geographic information. Web 2.0 and cloud computing are opening up new perspectives for a sector in full transformation. During this seminar, we will review the evolution of the sector and discuss the new opportunities offered by current technological developments. The positioning of the activities of the Centre d'Informatique pour la Région Bruxelloise will also be addressed.

Speaker

Eric Auquière (http://be.linkedin.com/in/auquiere) An agricultural engineer by training, Eric Auquière has worked in geomatics (remote sensing, Global Positioning Systems and GIS) for 17 years and completed his PhD thesis at UCL on the processing of satellite images. In 2000, he began a career in IT as a project manager specialized in GIS (I-MAGE Consult, Capgemini). Hired by the Centre d'Informatique pour la Région bruxelloise in 2007, he leads a team of 15 people responsible for developing web portals and cartographic applications for the regional institutions.

Slides

5/04/2011 Fraud detection in practice

Abstract

Starting from a strategic positioning of fraud detection within the complete fraud risk management process, the presentation will work its way down to specific fraud scenarios and their detection mechanisms. Along the way, we will touch upon the requirements that efficient and effective fraud detection poses in terms of organization (operating model, skill sets, …), hardware, and software architecture. Real-life examples will illustrate some of the hurdles we have faced over the years.
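
As a purely illustrative example of one simple detection mechanism of the kind covered in the talk (not an actual production rule), transactions whose amount deviates strongly from the cardholder's own history can be flagged for review:

<code python>
# Illustrative anomaly rule: flag transactions far outside a cardholder's
# usual spending pattern (a toy example, not a production detection rule).
import statistics

def flag_outliers(history, new_amounts, z_threshold=3.0):
    """history: past amounts for one cardholder; returns flagged new amounts."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history) or 1.0     # avoid division by zero
    return [a for a in new_amounts if abs(a - mean) / stdev > z_threshold]

past = [12.5, 30.0, 22.4, 18.9, 25.0, 40.0, 15.2]
print(flag_outliers(past, [27.0, 950.0]))         # -> [950.0]
</code>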

Speaker

Serge Waterschoot’s professional career started out at the VUB in both the Artificial Intelligence and Statistics domains. This gave him the theoretical basis for his later work in Fraud Detection and Data Mining at MasterCard Europe and Belgacom. Since 2001, he has worked for Atos Worldline (formerly Banksys), where he combines both professional passions as head of the Business Analytics Competence Center, a small team of data management, data mining, and reporting experts who turn the raw data available within Atos Worldline into value for the company and its customers.

Slides

26/04/2011 Next Challenges of Cloud Computing

Abstract

Cloud infrastructure has become one of the hottest topics in IT over the last two years. However, many technical challenges still have to be overcome. In this presentation, we will first describe the cloud infrastructure and the different layers that compose a cloud. Afterwards, we will dig into the technical challenges these architectures must face, in particular SLA management, multi-tenancy, governance and, finally, standardization and interoperability.

Speaker

Sabri Skhiri leads the Euranova R&D department, where he manages internal research projects, back-office R&D requests, technology watch for customers, technical assessments and innovation forums within Euranova. In addition, as an expert consultant, he leads the European R&D software architecture team of Huawei, a multinational telecom provider, where he drives various projects around service delivery platforms, high-performance messaging and integration frameworks, cloud infrastructure and service management.

Slides

3/05/2011 Software Quality in Critical Environments

Abstract

Software quality is a major concern for companies whose business is software development, particularly when the software is used in critical environments. Advances in software engineering, together with the experience accumulated over the last 20 years, have led to techniques and methods that make it possible to produce, in a reproducible and measurable way, software that complies with the required quality standards.

During this seminar, we will define what software quality is and outline the stakes associated with it. We will then present the actors involved in software quality, situating them within the timeline and scope of an IT project. Finally, we will present some techniques and methods commonly used as de facto standards in the software industry in 2011.

Speaker

Jean-Marc Noury is a software engineer. He collaborated for 6 years with Dassault and GTM on the development of software applications for the public works sector: digital terrain modelling and GPS guidance of earth-moving machinery. Since 2002, he has worked at Temenos Belgium as a Solution Architect and Technical Project Manager. In recent years, he has focused his expertise on building software solutions for counter-terrorism and anti-money-laundering in the banking sector.

Maurice Kern (CEO of Temenos Belgium). Holder of a doctorate (Doctorat de 3ème cycle) in Mathematics from the University of Paris, he has spent his entire career in IT, and in financial services in particular. After positions at Dassault Aviation, Matra, FileNet and Siebel Systems, Maurice Kern now heads Temenos Belgium, the world leader in STP and AML.

Slides

10/05/2011 Statistical Modeling of the Financial Markets and Systematic Trading

Abstract

Systematic hedge funds are investment funds trading financial assets using automated strategies based on mathematical models. BlueCrest has a solid track record in the systematic category, managing more than 12 billion dollars with three strategies trading a large variety of financial instruments.

The purpose of this talk is to give an overview of the principal financial instruments and of some basic investment models. We start with a passive investment strategy, which illustrates the notion of risk premium. Then, we introduce some families of techniques for forecasting asset price dynamics. Next, we focus on the risk allocation problem and review the optimal portfolio theory due to Nobel laureate Harry Markowitz. Finally, we talk about systematic trading in practice by showing how it is organized in a hedge fund like BlueCrest.
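
For reference, the classical Markowitz mean-variance allocation mentioned above can be written as the following optimization problem (the standard textbook form, not BlueCrest's models), where w is the vector of portfolio weights, \mu the vector of expected asset returns and \Sigma their covariance matrix:

<code latex>
\min_{w}\; w^{\top}\Sigma\,w
\quad\text{subject to}\quad
w^{\top}\mu = r_{\text{target}},\qquad
w^{\top}\mathbf{1} = 1
</code>

The objective minimizes the portfolio variance; the first constraint fixes the targeted expected return and the second makes the weights sum to one.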

Speaker

Jérôme Callut has been a statistical modeler at BlueCrest Capital Management since 2008. His research involves creating mathematical models for trading financial instruments in a systematic and automated way. Before this, Jérôme was a consultant in data mining at Vadis Consulting, where he focused on regression models used in Business Intelligence prediction tasks. Jérôme holds an MSc degree in Computer Science from the Université Libre de Bruxelles and a PhD in Applied Sciences from the Université catholique de Louvain.

Slides

17/05/2011 Building Software that Rocks: dynamic programming languages in the agile workplace

Abstract

For some years now, dynamic programming languages have been experiencing a revival in both academia and the software industry. For academia, they are excellent vehicles for programming-language research and teaching. For agile development entrepreneurs, they are key to delivering adaptable software solutions in record time. The increasingly accepted practice of agile development necessitates the creation of malleable solutions that can cope with frequent changes in requirements. However, those ever-changing requirements tend to degrade the quality of the software implementation when no appropriate quality assurance measures are taken. Dynamic programming languages and appropriate software quality assurance techniques are therefore two essentials that any agile software developer should master. The talk will demonstrate the malleability and power of dynamic programming languages and introduce a number of matching quality assurance techniques, using examples from both academia and industry.
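
As a hedged sketch of the pairing advocated in the talk (illustrative Python rather than the languages used by the speaker), a dynamic-language feature such as attaching behaviour to a class at runtime can be kept in check by an automated unit test:

<code python>
# Illustrative pairing of a dynamic-language feature with a quality-assurance
# measure: behaviour attached at runtime, guarded by a unit test.
import unittest

class Order:
    def __init__(self, total):
        self.total = total

def add_discount_support(cls, rate):
    """Attach a discounted_total method to an existing class at runtime."""
    cls.discounted_total = lambda self: self.total * (1 - rate)

class DiscountTest(unittest.TestCase):
    def test_runtime_extension(self):
        add_discount_support(Order, rate=0.10)
        self.assertAlmostEqual(Order(200.0).discounted_total(), 180.0)

if __name__ == "__main__":
    unittest.main()
</code>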

Speaker

Johan Brichau has over 10 years of research experience in advanced software engineering techniques such as aspect-oriented programming and generative programming. He holds a PhD in computer science from the Vrije Universiteit Brussel and was a post-doctoral researcher at the Université catholique de Louvain, focusing on software quality techniques, until the end of 2009. Johan is now founder and software architect of the young technology startup Inceptive. Throughout both his academic and his young industrial career, Johan has been a strong practitioner of dynamic programming languages and an agile software engineering attitude. At Inceptive, he leverages these techniques to deliver innovative software solutions. A prime example of such software, Yesplan – web-based event planning software – was recently launched to key customers and is housed in a joint venture between Inceptive and Arts Centre Vooruit.

Slides

 