Wintersemester 2017/18

Samuel Koop

March 21, 2018, 2:00 pm, F1.110

Congestion Games with Weighted Strategies

In this thesis we study a game-theoretic model that generalizes the original congestion games. The extension introduced here is weighted strategies: every strategy of every player is assigned a weight, an arbitrary positive real value that influences the cost functions of the resources. A cost function no longer takes as input the number of players using a resource, but the sum of the weights of all chosen strategies that use this resource.
We examine this model with respect to Nash equilibria. Both for general congestion games with weighted strategies and for the restriction to linear cost functions, we give counterexamples in which no Nash equilibrium exists; this is shown by exhibiting a cycle through all possible states of strategy choices. Using a potential function, we prove that singleton congestion games with weighted strategies always admit a Nash equilibrium. In matroid congestion games, the additional weights make it possible to manipulate the matroid property so that such a game behaves like an arbitrary congestion game with weighted strategies; only in the case of two resources can the existence of a Nash equilibrium be proven. Finally, we show that polynomial matroid congestion games with weighted strategies admit an α-approximate Nash equilibrium, which yields a nearly exact result whenever the weights of each player's strategies are of similar magnitude.
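The potential-function argument for the singleton case follows the classical template of Rosenthal. As general background (not the exact construction used in the thesis), for standard unweighted congestion games the function

```latex
\Phi(S) \;=\; \sum_{r \in R} \sum_{i=1}^{n_r(S)} c_r(i)
```

where $n_r(S)$ is the number of players using resource $r$ under strategy profile $S$, changes by exactly the deviating player's cost change under any unilateral deviation. It therefore decreases with every improvement step, so every better-response sequence ends in a pure Nash equilibrium.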

(Final presentation Bachelor's thesis)

Bart De Keijzer
University of Liverpool

March 20, 2018, 2:00 pm, F1.110

Algorithmic Mechanism Design for Markets

In this talk I will discuss some of my recent work on mechanisms for two-sided markets.
I will first briefly talk about mechanism design and algorithmic mechanism design (AMD) in general, where I will sketch the general approach, and the type of questions that we are interested in within this field. Market mechanisms have been a recent topic of interest in AMD. This may be attributed to the increasing prominence of internet applications that essentially function as market platforms, and to the interesting and challenging mechanism design questions that this gives rise to.
My own contributions to this research theme deal with designing feasible mechanisms for two-sided markets with provably good quality in terms of social welfare (i.e., total utility) and gain from trade (i.e., increase in utility). I will focus primarily on a recent result that characterises how well a feasible mechanism can perform in the bilateral trade setting, the simplest version of a two-sided market, in which there is only a single buyer, a single seller, and a single item to be traded. I will then extend this result to settings with many buyers and sellers and show that there exists a very simple mechanism whose performance comes arbitrarily close to the theoretical optimum as the number of buyers and sellers grows.
These results are based on joint work with Riccardo Colini-Baldeschi, Paul Goldberg, Stefano Leonardi, Tim Roughgarden, and Stefano Turchetta.

Jérôme Kempf

February 28, 2018, 2:00 pm, F1.110

Learning deterministic bandit behaviour from compositions

We consider a scenario in which an online service composer assembles a product from a pool of competing service providers. Actors in online service scenarios may follow different strategies when they know that the composer cannot observe each individual actor's contribution to the product. We present a new bandit model for the Multi-Armed Bandit problem that describes a simplified version of this scenario. Since consumers typically place a rating that reflects the quality of the complete product, there is room for the actors to behave opportunistically. In this thesis we define this model and use our simulation environment to run well-known learning algorithms and see how they fit the Multi-Armed Bandit problem. We analyse the performance of Thompson Sampling, the Upper Confidence Bound algorithm, and ε-greedy in the Aggregated Multi-Armed Bandit problem and investigate their strengths and weaknesses in this setting. Learning from aggregated rewards is a tough challenge for all algorithms due to the scarcity of information that can be gathered.
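Of the three algorithms compared above, ε-greedy is the simplest. A minimal sketch on plain Bernoulli arms (per-arm rewards, not the aggregated ratings studied in the thesis; the arm means, horizon, and ε below are illustrative assumptions):

```python
import random

def epsilon_greedy(true_means, horizon=10000, epsilon=0.1, seed=0):
    """Run epsilon-greedy on Bernoulli arms; return empirical mean
    estimates and pull counts after `horizon` rounds."""
    rng = random.Random(seed)
    k = len(true_means)
    counts = [0] * k
    values = [0.0] * k  # running empirical mean reward per arm
    for _ in range(horizon):
        if rng.random() < epsilon:
            arm = rng.randrange(k)                        # explore uniformly
        else:
            arm = max(range(k), key=lambda a: values[a])  # exploit best estimate
        reward = 1.0 if rng.random() < true_means[arm] else 0.0
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]  # incremental mean
    return values, counts
```

With enough rounds, the bulk of the pulls concentrates on the arm with the highest mean, while the ε fraction of exploration keeps all estimates converging.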

(Final presentation Bachelor's thesis)

Arne Kemper

January 31, 2018, 2:00 pm, F1.110

Pure Nash Equilibria in Robust Congestion Games via Potential Functions

Regular Congestion Games assume that all resources are available at all times. We present reasonable properties for games that drop this assumption, including the requirement that such games remain Potential Games. We show that a class of games with these properties exists and that our proposed class is tight. Moreover, we show that dropping the Potential Game requirement leads to games where even deciding whether a pure Nash equilibrium (PNE) exists is hard.
We prove that these games have the same Price of Anarchy as comparable Congestion Games and that finding a PNE in them is PLS-complete. These results hold even under strict further assumptions on the structure of the games. We also show that the naive approach to finding a Pure Nash Equilibrium in these games leads to exponential runtime and that, in contrast to regular Congestion Games, this cannot be improved by making assumptions on the structure of the strategy spaces.

(Final presentation Master's Thesis)

Marcel Nachtigall

January 24, 2018, 2:30 pm, F1.110

Scenario-driven Strategy Analysis in an n-player Composition Game Model

In this thesis we look at the Composition Game Model by Feldotto and Skopalik, in which a composer selects a set of service providers, each contributing a component to a composed product. The quality of the components affects the quality of the composed product. Each service provider chooses to provide either a high- or a low-quality component and is rewarded with a payoff depending on this choice. We use simulations of an Evolution Inspired Model to evaluate player strategies and market characteristics in this market. With the Evolution Inspired Model we identify superior player strategies and the resulting market quality as a function of the game design. We further identify tipping points in the game design for the superiority of milking strategies, which were identified and formalized by Djawadi et al. [2] in a laboratory experiment of the market.

(Final presentation Bachelor's Thesis)

Johannes Schaefer

January 10, 2018, 2:00 pm, F1.110

Using Controlled Dynamics to Improve Network Quality

Dynamics pose challenges to networks, especially if the networks are not guaranteed to be continuously connected. We look at such ad hoc networks and search for models that allow us to dictate the dynamics of the network if we are willing to pay a price for it. The goal is to improve metrics of the generated networks, such as the *dynamic diameter*, while keeping the cost for the control over participants low.
As a first approach we consider one-dimensional cyclic fields on which nodes can move. We associate with every node a spot on the cycle that is its home. Nodes 'prefer' to be close to their homes: the further away from its home a node is, the higher the cost for algorithms to control it.
We introduce algorithms that try to minimize the imposed cost and compare them with respect to the knowledge about the system they require, as well as the quality of the generated networks, as far as it could be observed through plausibility considerations and simulations.

Julian Kutzias

December 20, 2017, 2:00 pm, F1.110

River Generation in Extendable 3D-Terrain

Procedural generation is a useful tool for the creation of large 3D landscape sceneries. In some applications, such scenes are assembled piece by piece or are extended at runtime. This thesis presents an algorithm that procedurally generates rivers in terrain which can be extended by adding new areas at the borders of the scene. After such an extension, rivers generated earlier automatically continue into the new terrain wherever their path was previously cut off by the edge of the scene. The algorithm uses heightmaps to run a height-based path search from a river origin point, assigning water values to the traversed areas; these values then erode the terrain to form river beds.

(Final presentation Bachelor's thesis, talk in German)

Matthias Feldotto

December 13, 2017, 2:00 pm, F1.110

Disaggregating Consumer Evaluations

Final products or services are often compositions of a number of intermediary goods or resources. An ongoing question is to what extent such intermediary goods influence the quality of the final good. In particular, in the case of experience goods, customers are asked to evaluate the quality of the composed product after purchase, i.e., they evaluate the aggregate. Typically, the quality of intermediary goods is not observable and hence cannot be evaluated.
The question we tackle in this talk is whether it is possible to use consumer evaluations on final products to assess the quality of intermediary goods. As there are many consumers evaluating, two questions arise. First, how should we aggregate evaluations across customers? And second, how should we disaggregate information on composed products to arrive at a valuation of components? Finally, the aggregation and disaggregation methods should be consistent with each other.
In this work we show that we can disaggregate the evaluations of composed goods with the help of solution concepts for TU games like the Shapley value, and aggregate the evaluations of different customers by using concepts from classical social choice theory or cardinal solution concepts like (weighted) averages.
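The disaggregation step can be illustrated directly: the Shapley value of a TU game averages each player's marginal contribution over all arrival orders. A minimal exact implementation (exponential in the number of players, so only for small games; the characteristic function in the note below is a made-up example):

```python
from itertools import permutations
from math import factorial

def shapley_values(players, v):
    """Exact Shapley values of the TU game v: average each player's
    marginal contribution v(S + p) - v(S) over all player orders."""
    phi = {p: 0.0 for p in players}
    for order in permutations(players):
        coalition = frozenset()
        for p in order:
            phi[p] += v(coalition | {p}) - v(coalition)
            coalition = coalition | {p}
    n_orders = factorial(len(players))
    return {p: total / n_orders for p, total in phi.items()}
```

For a toy product with two components where only the full composition has value 1 and each component alone has value 0, the Shapley value credits each component with 0.5.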

Alexander Mäcker

December 6, 2017, 2:00 pm, F1.110

Scheduling on Heterogeneous Machines with Setup Times

We consider a scheduling problem in which $n$ jobs need to be processed on $m$ parallel machines so as to minimize the makespan. The set of jobs is partitioned into $k$ classes and a setup needs to take place whenever a machine switches from processing a job of one class to a job of a different class. While previous work focused on identical parallel machines, we recently started to consider settings with heterogeneous/non-identical machines. Concerning non-identical machines, one usually distinguishes uniformly related machines, where each machine has a fixed speed; and unrelated machines, where processing times can vary arbitrarily between different machines. In this talk, we will take a look at a first simple 3-approximation for uniform machines and first insights regarding the unrelated case.
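As a baseline for the class/setup model, a minimal greedy heuristic on identical machines (a simplification for illustration, not the 3-approximation for uniform machines mentioned above): keep each class in one batch so its setup is paid only once, and assign batches, largest first, to the currently least-loaded machine.

```python
from collections import defaultdict

def greedy_class_scheduling(jobs, num_machines, setup):
    """jobs: list of (class_label, processing_time). Batch jobs by class,
    then assign each batch (setup + total processing) LPT-style to the
    least-loaded machine. Returns machine loads and class assignment."""
    by_class = defaultdict(float)
    for cls, proc in jobs:
        by_class[cls] += proc
    loads = [0.0] * num_machines
    schedule = [[] for _ in range(num_machines)]
    for cls, total in sorted(by_class.items(), key=lambda kv: -kv[1]):
        m = min(range(num_machines), key=lambda i: loads[i])  # least loaded
        loads[m] += setup + total
        schedule[m].append(cls)
    return loads, schedule
```

Keeping a class on a single machine avoids paying its setup repeatedly, at the price of coarser load balancing; splitting classes across machines is exactly where the harder algorithmic questions begin.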

Andreas Cord-Landwehr and Ralf Petring

December 1, 2017, 4:15 pm, F0.530

Christmas Seminar - Algorithms and Complexity

Andreas Cord-Landwehr and Ralf Petring will give a talk as part of the Christmas seminar of the Algorithms and Complexity research group. Both are former members of the group of Prof. Friedhelm Meyer auf der Heide.
Ralf Petring completed his PhD under Friedhelm Meyer auf der Heide in 2014. He now works as a software developer at GMS Developent in Paderborn. He will give an overview of GMS and report on his field of work.
Andreas Cord-Landwehr completed his PhD under Friedhelm Meyer auf der Heide in 2016. He now works as a software developer at the agricultural machinery company Claas in Harsewinkel. He will give an overview of Claas and report on his field of work.

Björn Feldkord

November 29, 2017, 2:00 pm, F1.110

Augmenting Online Facility Location with Movement

The classical Facility Location Problem deals with the placement of resources, where the cost of serving clients over shorter distances has to be balanced against the cost of utilizing additional resources. In the online variant, one client arrives per time step and has to be served immediately. Fotakis showed that the competitive ratio of the online variant cannot be independent of the number of clients. The lower bound relies on a construction that requires arbitrarily small distances between an optimal facility placement and the placement of an algorithm. We introduce a variant in which an online algorithm may adjust its placement at a cost proportional to the distance of the rearrangement, and we discuss strategies for this problem and their analysis.
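For background on the online setting discussed above, Meyerson's classical randomized algorithm is the standard baseline: each arriving client opens a new facility with probability proportional to its distance to the nearest open facility. A sketch on the line with uniform facility cost f (the client positions and f are illustrative):

```python
import random

def online_facility_location(clients, f, seed=0):
    """Meyerson-style randomized online facility location on the line:
    an arriving client at distance d from the nearest open facility opens
    a facility at its own position with probability min(1, d / f),
    otherwise it connects and pays d."""
    rng = random.Random(seed)
    facilities = []
    cost = 0.0
    for x in clients:
        d = min((abs(x - y) for y in facilities), default=float("inf"))
        if rng.random() < min(1.0, d / f):
            facilities.append(x)  # open here (the first client always opens)
            cost += f
        else:
            cost += d             # connect to the nearest open facility
    return facilities, cost
```

Far-away clients are likely to open their own facility, nearby ones usually just connect; in expectation the opening cost and the connection cost stay balanced against each other.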

Sascha Brandt

November 22, 2017, 2:00 pm, F1.110

Towards Surfel Cone Tracing for Real-Time Indirect Illumination

In this talk I present current ideas for using surfels as a basis for indirect illumination. The basic idea is to utilize the structure of Progressive Blue Surfels, a progressive point-based approximation method for complex 3D scenes, for the lighting computation: cones are shot into the scene to collect visible surfels that store the scene's lighting information. This idea is inspired by the Voxel Cone Tracing approach for interactive indirect illumination by Crassin et al. from 2011. The hope is that an implicit hierarchy defined by the surfels provides a faster and more memory-efficient way to perform the real-time lighting computations and allows for easy progressive refinement.

Steffen Knorr

November 15, 2017, 2:00 pm, F1.110

Global Illumination using Surfel Cone Tracing

Global Illumination has proven to be an important subject in 3D rendering. This group of lighting problems permeates all realistic rendering applications, and indirect lighting in particular is crucial for the believability of a computer-generated image. This thesis presents a new algorithm that combines two established techniques for reducing scene complexity and collecting indirect light, namely Surfels and Cone Tracing. The algorithm was implemented in the rendering framework PADrend, which produced the images we use for the evaluation. We also discuss the runtime performance and the artifacts that occur.

(Final presentation Master's Thesis)

Arne Kemper

November 8, 2017, 1:15 pm, F2.211

Pure Nash Equilibria in Robust Congestion Games via Potential Functions

Congestion Games are a useful tool to model the behaviour of players using a shared set of resources. However, these models only consider resources that never fail, an assumption that is obviously not always true. We present a class of games that allows easy modelling of failing resources while still being potential games. Furthermore, we show that these games have the same price of anarchy as traditional congestion games using the same class of delay functions.

(Introductory presentation Master's Thesis)

Vipin Ravindran Vijayalakshmi

November 7, 2017, 2:15 pm, F1.110

Bounding the Inefficiency of Equilibria in Congestion Games under Taxation

Congestion Games belong to a class of games in which a pure Nash equilibrium always exists. However, due to the non-cooperative nature of the players, a pure Nash equilibrium need not be optimal with respect to the social cost. This inefficiency of Nash equilibria is measured by the Price of Anarchy (PoA). We study the effects of economic incentives such as taxes and their ability to induce desirable behavior amongst players in a Congestion Game. Furthermore, we also study the influence of economic incentives on the Stretch in a Congestion Game. The basic idea is to transform any Congestion Game G into a game G' in which a player's disutility comprises the cost incurred by using a particular resource plus the additional tax imposed for using that resource. The PoA of the game G is then analyzed with respect to the equilibria of the game G'. The focus is on constructing a methodology, oblivious to the social optimum, that determines feasible refundable taxes minimizing the inefficiency of equilibria in Congestion Games with linear and polynomial cost functions that are strictly non-decreasing and have non-negative coefficients.

(Final presentation Master's Thesis)

Lennart Wachowiak

October 25, 2017, 2:00 pm, F1.110

The Mobile Server Problem in Networks

A growing number of applications are based on services in the cloud. This shift is accompanied by the emergence of large datacenters in which multiple computers take up the space of warehouses. These computers often work on shared memory or common tasks.
A model for this is given by the Page Migration Problem, which deals with a network of nodes, one of which holds a data page. In each time step a request for the data page occurs at one of the nodes. An online algorithm can then move the data page to a different node in order to minimize costs. A similar model is the Mobile Server Problem, in which the distance the data page can be moved per time step is restricted. So far it has only been studied in Euclidean space.
To better adapt the Mobile Server Problem to the aforementioned scenario, we discuss it in networks. We give lower bounds on the competitive ratio and analyze the competitiveness of a deterministic algorithm for the problem on trees. Furthermore, we analyze the performance of the same algorithm in different scenarios by means of simulations.

(Final presentation Bachelor's Thesis, talk in German)

Andreas Schönhals, Ansgar Mährlein, Britta Heymann, Duke Ovie Obewhere, Franklin Sidoine Nya Fogue, Marcus Nachtigall, Marius Meyer, Melanie Bruns, Milan Petkovic, Moemmur Shahzad, Nils Weidmann, Oumama Msellek, Robin Oppermann, Sandeep Varma Ganaraju

October 11, 2017, 2:00 pm, F1.110

Ad hoc Networks with Smartphones for Disaster Management Support

In an emergency situation or a major disaster, the need to coordinate people willing to help can be hindered by the lack of continuous internet access. To enable communication between mobile devices in such situations, an ad-hoc network is needed that operates even without a stable internet infrastructure. In these kinds of networks, challenges such as network splits, message loss, network overload, and draining battery power have to be faced.
We propose a Java-based Android application that connects devices without making assumptions about the network infrastructure. For routing messages in this scenario, the protocol DisRouPT was developed and compared to existing routing protocols by simulating a large-scale disaster scenario in Paderborn using the network simulator The ONE.

(Final presentation of the project group "Ad hoc Networks with Smartphones for Disaster Management Support")

Martin Ziegler
KAIST, School of Computing, Daejeon, South Korea

October 4, 2017, 10:00 am, F1.110

Formal Verification in Imperative Multivalued Programming over Continuous Data Types

Inspired and guided by the iRRAM C++ library (Müller 2001), we formally specify a programming language for the paradigm of EXACT REAL COMPUTATION: reliably operating on encapsulated CONTINUOUS data types such as (not necessarily algebraic) real numbers --- imperatively and exactly (no rounding errors) with primitives computable in the sense of Recursive Analysis and a necessarily modified multivalued=non-extensional semantics of tests. We then propose a complete logical system over two types, integers and real numbers, for rigorous correctness proofs of such algorithms. We extend the rules of Hoare Logic to support the formal derivation of such proofs, and have them verified in the Coq Proof Assistant. Our approach is exemplified with three simple numerical example problems: integer rounding, solving systems of linear equations, and continuous root finding.
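The "no rounding errors" principle can be mimicked in a toy way with exact rational interval bisection. This is only an illustration of enclosure refinement for the continuous root-finding example, not iRRAM's actual machinery (which is a C++ library):

```python
from fractions import Fraction

def sqrt_enclosure(x, eps):
    """Shrink a rational interval [lo, hi] around sqrt(x) by bisection
    until its width is at most eps. All arithmetic is exact rational,
    so no floating-point rounding error enters the enclosure."""
    lo, hi = Fraction(0), Fraction(max(x, 1))  # invariant: lo^2 <= x <= hi^2
    while hi - lo > eps:
        mid = (lo + hi) / 2
        if mid * mid <= x:
            lo = mid
        else:
            hi = mid
    return lo, hi
```

The caller chooses the output precision eps after the fact, and the enclosure invariant guarantees the true value always lies inside the returned interval, echoing the encapsulated-continuous-data view of exact real computation.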

(joint work with Gyesik Lee, Norbert Müller, Eike Neumann, Sewon Park, and Norbert Preining)