
Project work (38%) The courses listed above are illustrative and may change. A full list of current options is available on the Computer Science website.

Assessment: Five take-home exams or written papers. Lists of options offered in the 2nd, 3rd and 4th years are illustrative only, and may change from time to time.

Further information about all of our courses: /computerscienceatoxford. The content and format of this course may change in some circumstances. Read further information about potential course changes. About 20% of Mathematics and Computer Science graduates tend to go on to further study. Recent graduates secured positions as software and hardware professionals in research, finance and investment analysis, and include a product controller for an international bank, an actuarial consultant and an…

A-levels: A*AA with the A* in Mathematics, Further Mathematics or Computing/Computer Science. Advanced Highers: AA/AAB. Wherever possible, your grades are considered in the context in which they have been achieved. (See further information on how we use contextual data.)

Candidates are expected to have Mathematics to A-level (A or A* grade), Advanced Higher (A grade), Higher Level in the IB (score 7) or another equivalent. Further Mathematics or another science would also be highly recommended. We expect you to have taken and passed any practical component in your chosen science subjects. All candidates must also take the Mathematics Admissions Test (MAT) as part of their application.

Please see how to apply for further details.

Oxford University is committed to recruiting the best and brightest students from all backgrounds. We offer a generous package of financial support to Home/EU students from lower-income households. (UK nationals living in the UK are usually Home students.) Fees: there are no compulsory costs for this course beyond the fees shown above and your living costs. All candidates must follow the application procedure as shown in applying to Oxford.

The information below gives specific details for students applying for this course. For more information on how to apply, including advice on interviews, specimen MAT papers, and sample questions, please see the Computer Science department website. Written test: all candidates must take the Mathematics Admissions Test (MAT) in their own school or college or other approved test centre on Thursday 2 November 2017. Candidates must make sure they are available to take the test at this time. Separate registration for this test is required and the final deadline for entries is Sunday 15 October 2017.

It is the responsibility of the candidate to ensure that they are registered for this test. We strongly recommend making the arrangements in plenty of time before the deadline. Further information about all our written tests can be found on our tests page. Details about the MAT can be found on the Maths Aptitude Test website. Written work: you do not need to submit any written work as part of an application for this course.

What are tutors looking for? We look for proven mathematical flair, the ability to think and work independently, the capacity to absorb and use new ideas, and enthusiasm. We use these criteria and the MAT results to decide whom to interview. At interview, we explore how you tackle unfamiliar problems and respond to new ideas; we are more interested in how you approach problem-solving than in the solution. We don’t require any previous formal qualification in computing, but we do expect a real interest in the subject. Selection criteria. Suggested reading: introductory reading for prospective applicants to Computer Science can be found on the departmental website.

You may also like to look at our GeomLab website, which will introduce you to some of the most important ideas in computer programming in an interactive, visual way through a guided activity. Kamil: 'I love many things about my course. I love the fact that it’s hard, that it’s very theoretical and that we get a lot of practical work. Even when the work is a little challenging you’re never lost because there are so many people around to help you. The tutors really support you at every step and this motivates you to do well.

There are so many wonderful things that I’ve learnt that I never knew existed before. There are definitely moments when, sitting in front of a problem sheet, you realise that you’re at the right place. Computer Science is everything I had hoped for.' Maria is an IT consultant at CHP Consulting. She says: ‘This has been my first job since graduating.

It has allowed me to use the technical skills gained in my degree in a client-facing environment.’ Contextual information: the Key Information Sets provide a lot of numbers about the Oxford experience – but there is so much about what you get here that numbers can’t convey. It’s not just the quantity of the Oxford education that you need to consider; there is also the quality – let us tell you more. Oxford’s tutorial system: regular tutorials, which are the responsibility of the colleges, are the focal point of teaching and learning at Oxford. The tutorial system is one of the most distinctive features of an Oxford education: it ensures that students work closely with tutors throughout their undergraduate careers, and offers a learning experience which is second to none.

A typical tutorial is a one-hour meeting between a tutor and one, two, or three students to discuss reading and written work that the students have prepared in advance. It gives students the chance to interact directly with tutors, to engage with them in debate, to exchange ideas and argue, to ask questions, and of course to learn through the discussion of the prepared work. Many tutors are world-leaders in their fields of research, and Oxford undergraduates frequently learn of new discoveries before they are published. Each student also receives teaching in a variety of other ways, depending on the course. This will include lectures and classes, and may include laboratory work and fieldwork.

But the tutorial is the place where all the elements of the course come together and make sense. Meeting regularly with the same tutor – often weekly throughout the term – ensures a high level of individual attention and enables the process of learning and teaching to take place in the context of a student’s individual needs. The tutorial system also offers the sustained commitment of one or more senior academics – as college tutors – to each student’s progress. It helps students to grow in confidence, to develop their skills in analysis and persuasive argument, and to flourish as independent learners and thinkers. The benefits of the college system: every Oxford student is a member of a college.

The college system is at the heart of the Oxford experience, giving students the benefits of belonging to both a large and internationally renowned university and a much smaller, interdisciplinary, college community. Each college brings together academics, undergraduate and postgraduate students, and college staff. The college gives its members the chance to be part of a close and friendly community made up of both leading academics and students from different subjects, year groups, cultures and countries. The relatively small size of each college means that it is easy to make friends and contribute to college life.

There is a sense of belonging, which can be harder to achieve in a larger setting, and a supportive environment for study and all sorts of other activities.

Colleges organise tutorial teaching for their undergraduates, and one or more college tutors will oversee and guide each student’s progress throughout his or her career at Oxford. The college system fosters a sense of community between tutors and students, and among students themselves, allowing for close and supportive personal attention to each student’s academic development. It is the norm that undergraduates live in college accommodation in their first year, and in many cases they will continue to be accommodated by their college for the majority of, or the entire duration of, their course. Colleges invest heavily in providing an extensive range of services for their students: as well as accommodation, colleges provide food, library and IT resources, sports facilities and clubs, drama and music, social spaces and societies, access to travel or project grants, and extensive welfare support. For students the college often becomes the hub of their social, sporting and cultural life.

C The increased relevance of renewable energy sources has modified the behaviour of the electrical grid. Some renewable energy sources affect the network in a distributed manner: whilst each unit has little influence, a large population can have a significant impact on the global network, particularly in the case of synchronised behaviour. This work investigates the behaviour of a large, heterogeneous population of photovoltaic panels connected to the grid. We employ Markov models to represent the aggregated behaviour of the population, while the rest of the network (and its associated consumption) is modelled as a single equivalent generator, accounting for both inertia and frequency regulation. This project will provide extensions of this recent research. In collaboration with an industrial partner. Prerequisites: Computer-Aided Formal Verification, Probabilistic Model Checking

C Stochastic Hybrid Systems (SHS) are dynamical models that are employed to characterize the probabilistic evolution of systems with interleaved and interacting continuous and discrete components. Formal analysis, verification, and optimal control of SHS models represent relevant goals because of their theoretical generality and for their applicability to a wealth of studies in the Sciences and in Engineering. In a number of practical instances the presence of a discrete number of continuously operating modes (e.g., in fault-tolerant industrial systems), the effect of uncertainty (e.g., in safety-critical air-traffic systems), or both occurrences (e.g., in models of biological entities) advocate the use of a mathematical framework, such as that of SHS, which is structurally predisposed to model such heterogeneous systems. In this project, we plan to investigate and develop new analysis and verification techniques (e.g., based on abstractions) that are directly applicable to general SHS models, while being computationally scalable. Courses: Computer-Aided Formal Verification, Probabilistic Model Checking, Probability and Computing, Automata Logic and Games. Prerequisites: Familiarity with stochastic processes and formal verification

C Smart microgrids are small-scale versions of centralized electricity systems, which locally generate, distribute, and regulate the flow of electricity to consumers.

Among other advantages, microgrids have shown positive effects on the reliability of distribution networks. These systems present heterogeneity and complexity coming from (1) local and volatile renewables generation, and (2) the presence of nonlinear dynamics over both continuous and discrete variables. These factors call for the development of proper quantitative models.

This framework provides the opportunity of employing formal methods to verify properties of the microgrid. The goal of the project is in particular to focus on energy production via renewables, such as photovoltaic panels. The project can benefit from a paid visit/internship to industrial partners. Courses: Computer-Aided Formal Verification, Probabilistic Model Checking, Probability and Computing, Automata Logic and Games. Prerequisites: Familiarity with stochastic processes and formal verification; no specific knowledge of smart grids is needed. B 2017-18

C This project is targeted at enhancing the software toolbox VeriSiMPL ("very simple"), which has been developed to enable the abstraction of Max-Plus-Linear (MPL) models.

MPL models are specified in MATLAB and abstracted to Labeled Transition Systems (LTS). The LTS abstraction is formally put in relationship with its MPL counterpart via a (bi)simulation relation. The abstraction procedure runs in MATLAB and leverages sparse representations, fast manipulations based on vector calculus, and optimized data structures such as Difference-Bound Matrices. The LTS can be pictorially represented via the Graphviz tool and exported to the PROMELA language. This enables the verification of MPL models against temporal specifications within the SPIN model checker.
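For readers new to max-plus algebra, the following minimal Python sketch (illustrative only, and not part of VeriSiMPL, which is MATLAB-based) shows the basic MPL state update x(k+1) = A ⊗ x(k), where the matrix-vector product uses max in place of summation and + in place of multiplication; the matrix entries and initial state are invented for the example.

import numpy as np

def maxplus_matvec(A, x):
    """Max-plus matrix-vector product: (A (x) x)_i = max_j (A[i, j] + x[j])."""
    return np.max(A + x[np.newaxis, :], axis=1)

# Toy 2-state MPL model (e.g. processing times in a small production line);
# the numbers are illustrative only.
A = np.array([[3.0, 7.0],
              [2.0, 4.0]])
x = np.array([0.0, 0.0])  # initial event times

# Simulate a few events: x(k+1) = A (x) x(k) in max-plus algebra.
for k in range(5):
    x = maxplus_matvec(A, x)
    print(f"event {k + 1}: x = {x}")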

Courses: Computer-Aided Formal Verification, Numerical Solution of Differential Equations. Prerequisites: Some familiarity with dynamical systems, working knowledge of MATLAB and C

C Sensorisation and actuation in smart buildings and the development of smart HVAC (heating, ventilation and air-conditioning) control strategies for energy management allow for optimised energy usage, leading to a reduction in power consumption or to optimised demand/response strategies that are key in a rather volatile market. This can further lead to optimised maintenance for building devices. Of course, the sensorisation of buildings leads to heavy requirements on the overall infrastructure: we are interested in devising new approaches towards the concept of using "humans as sensors". Further, we plan to investigate approaches to perform meta-sensing, namely to extrapolate knowledge from physical sensors towards that of virtual elements (as an example, to infer the current building occupancy from correlated measurements of temperature and humidity dynamics). On the actuation side, we are likewise interested in engineering non-invasive, minimalistic solutions which are robust to uncertainty and performance-certified.

The plan for this project is to make the first steps in this direction, based on recent results in the literature. The project can benefit from a visit to Honeywell Labs (Prague). Courses: Computer-Aided Formal Verification. Prerequisites: Some familiarity with dynamical systems.

C This project will explore connections of techniques from machine learning with successful approaches from formal verification.

The project has two sides: a theoretical one and a more practical one; it will be up to the student to emphasise either of the two depending on his/her background and/or interests. The theoretical part will develop existing research, for instance in one of the following two interdisciplinary domain pairs: learning & repair, or reachability analysis & Bayesian inference. On the other hand, a more practical project will apply the above theoretical connections to a simple model setup in the area of robotics and autonomy. Courses: Computer-Aided Formal Verification, Probabilistic Model Checking, Machine Learning

C This project shall investigate a rich research line, recently pursued by a few within the Department of CS, looking at the development of quantitative abstractions of Markovian models. Abstractions come in the form of lumped, aggregated models, which are beneficial in being easier to simulate or to analyse.

Key to the novelty of this work, the proposed abstractions are quantitative, in that precise error bounds with respect to the original model can be established. As such, whatever can be shown over the abstract model can also be formally discussed over the original one. This project, grounded in existing literature, will pursue (depending on the student's interests) extensions of this recent work, or its implementation as a software tool. Courses: Computer-Aided Formal Verification, Probabilistic Model Checking, Machine Learning

C Reinforcement Learning (RL) is a known architecture for synthesising policies for Markov Decision Processes (MDPs). We work on extending this paradigm to the synthesis of 'safe policies', or, more generally, of policies such that a linear-time property is satisfied.

We convert the property into an automaton and then construct a product MDP between the automaton and the original MDP. A reward function derived from the automaton's acceptance condition is then defined on this product; with this reward function, RL synthesises a policy that satisfies the property, so the policy synthesis procedure is 'constrained' by the given specification. Additionally, we show that the RL procedure sets up an online value iteration method to calculate the maximum probability of satisfying the given property at any given state of the MDP (a toy version of this computation is sketched below). We evaluate the performance of the algorithm on numerous numerical examples.
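As a hedged illustration of that value-iteration step, the toy Python sketch below computes the maximum probability of eventually reaching an accepting ("target") state in a small hand-written product MDP; the states, actions and probabilities are invented for the example and are not taken from the project.

# Toy product MDP: states 0..3, state 3 is the accepting (target) state.
# transitions[s][a] is a list of (next_state, probability) pairs.
# All numbers here are made up purely for illustration.
transitions = {
    0: {"a": [(1, 0.9), (2, 0.1)], "b": [(2, 1.0)]},
    1: {"a": [(3, 0.5), (0, 0.5)]},
    2: {"a": [(2, 1.0)]},          # a sink that never reaches the target
    3: {"a": [(3, 1.0)]},          # target is absorbing
}
target = {3}

def max_reach_probability(transitions, target, iterations=1000, tol=1e-10):
    """Value iteration for the max probability of eventually reaching `target`."""
    value = {s: (1.0 if s in target else 0.0) for s in transitions}
    for _ in range(iterations):
        new_value = {}
        for s, actions in transitions.items():
            if s in target:
                new_value[s] = 1.0
                continue
            # Bellman backup: the best action maximises the expected successor value.
            new_value[s] = max(
                sum(p * value[t] for t, p in succ) for succ in actions.values()
            )
        done = max(abs(new_value[s] - value[s]) for s in transitions) < tol
        value = new_value
        if done:
            break
    return value

print(max_reach_probability(transitions, target))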

This project will provide extensions of these novel and recent results. Prerequisites: Computer-Aided Formal Verification, Probabilistic Model Checking, Machine Learning

C Stochastic hybrid systems (SHS) are dynamical models for the interaction of continuous and discrete states. The probabilistic evolution of the continuous and discrete parts of the system are coupled, which makes analysis and verification of such systems compelling. Among specifications of SHS, probabilistic invariance and reach-avoid have received quite some attention recently. Numerical methods have been developed to compute these two specifications.

These methods are mainly based on state-space partitioning and abstraction of the SHS by Markov chains, which are optimal in the sense of reducing the abstraction error with a minimum number of Markov states. The goal of the project is to combine the codes that have been developed for these methods. The student should also design a nice user interface (for the choice of dynamical equations, parameters, methods, etc.). Courses: Probabilistic Model Checking, Probability and Computing, Numerical Solution of Differential Equations. Prerequisites: Some familiarity with stochastic processes, working knowledge of MATLAB and C

C Pebble games are an important and widely used tool in logic, algorithms and complexity, constraint satisfaction and database theory.

The idea is that we can explore a pair of structures, e.g. graphs, by placing up to k pebbles on them, so that we have a window of size at most k on the two structures. If we can always keep these pebbles in sync so that the two k-sized windows look the same (are isomorphic), then we say that Duplicator has a winning strategy for the k-pebble game. This gives a resource-bounded notion of approximation to graphs and other structures which has a wide range of applications.
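To make the game concrete, here is a small illustrative Python check (not taken from the project) of whether a placement of pebble pairs on two toy graphs is a partial isomorphism, i.e. whether Duplicator's current position is still "in sync".

# Two toy undirected graphs given as adjacency sets; purely illustrative.
G = {1: {2}, 2: {1, 3}, 3: {2}}                 # a path 1-2-3
H = {"a": {"b"}, "b": {"a", "c"}, "c": {"b"}}   # an isomorphic path a-b-c

def is_partial_isomorphism(pebbles, G, H):
    """pebbles: list of (g_vertex, h_vertex) pairs, one per pebble pair in play.

    Duplicator is still in the game iff the induced map is well defined,
    injective, and preserves adjacency and non-adjacency in both directions.
    """
    mapping = {}
    for g, h in pebbles:
        if mapping.get(g, h) != h:          # same G-vertex sent to two H-vertices
            return False
        mapping[g] = h
    if len(set(mapping.values())) != len(mapping):   # not injective
        return False
    for g1, h1 in mapping.items():
        for g2, h2 in mapping.items():
            if (g2 in G[g1]) != (h2 in H[h1]):        # adjacency must agree
                return False
    return True

print(is_partial_isomorphism([(1, "a"), (2, "b")], G, H))  # True: edge maps to edge
print(is_partial_isomorphism([(1, "a"), (3, "b")], G, H))  # False: non-edge maps to edge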

Monads and comonads are widely used in functional programming, e.g. in Haskell, and come originally from category theory. It turns out that pebble games, and similar notions of approximate or local views on data, can be captured elegantly by comonads, and this gives a powerful language for many central notions in constraints, databases and descriptive complexity. For example, k-consistency can be captured in these terms; another important example is treewidth, a key parameter which is very widely used to give “islands of tractability” in otherwise hard problems.

Finally, monads can be used to give various notions of approximate or non-classical solutions to computational problems. These include probabilistic and quantum solutions. For example, there are quantum versions of constraint systems and games which admit quantum solutions when there are no classical solutions, thus demonstrating a “quantum advantage”. Monads and comonads can be used in combination, making use of certain “distributive laws”. The aim of this project is to explore a number of aspects of these ideas.

Depending on the interests and background of the student, different aspects may be emphasised, ranging over functional programming, category theory, logic, algorithms and descriptive complexity, and probabilistic and quantum computation. Possible directions include: developing Haskell code for the k-pebbling comonad and various non-classical monads, and using this to give a computational tool-box for various constructions in finite model theory and probabilistic or quantum computation; and developing the category-theoretic ideas involved in combining monads and comonads, and studying some examples.

Further directions include using the language of comonads to capture other important combinatorial invariants such as tree-depth, and developing the connections between category theory, finite model theory and descriptive complexity.

Leonid Libkin, Elements of Finite Model Theory (background on pebble games and the connection with logic and complexity); The pebbling comonad in finite model theory (a technical report describing the basic ideas, which can serve as a starting point). B 2017-18

C Contextuality is a fundamental feature of quantum physical theories and one that distinguishes them from classical mechanics. In a recent paper by Abramsky and Brandenburger, the categorical notion of sheaves has been used to formalize contextuality. This has resulted in generalizing and extending contextuality to other theories which share some structural properties with quantum mechanics. A consequence of this type of modelling is a succinct logical axiomatization of properties such as non-local correlations and, as a result, of classical no-go theorems such as Bell and Kochen-Specker. Like quantum mechanics, natural language has contextual features; these have been the subject of much study in distributional models of meaning, originated in the work of Firth and later advanced by Schütze.

These models are based on vector spaces over the semiring of positive reals with an inner product operation.


The vectors represent meanings of words, based on the contexts in which they often appear, and the inner product measures degrees of word synonymy. Despite their success in modeling word meaning, vector spaces suffer from two major shortcomings: firstly, they do not immediately scale up to sentences, and secondly, they cannot, at least not in an intuitive way, provide semantics for logical words such as 'and', 'or', 'not'. Recent work in our group has developed a compositional distributional model of meaning in natural language, which lifts vector space meaning to phrases and sentences.
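As a minimal illustration of the vector-space picture described above (with made-up co-occurrence counts rather than corpus data), the Python sketch below builds toy word vectors and measures synonymy with the normalised inner product (cosine similarity).

import numpy as np

# Tiny co-occurrence counts over an invented context vocabulary; in a real
# distributional model these would come from a large corpus.
contexts = ["bark", "purr", "pet", "engine"]
vectors = {
    "dog": np.array([8.0, 0.0, 5.0, 0.0]),
    "cat": np.array([0.0, 7.0, 6.0, 0.0]),
    "car": np.array([0.0, 0.0, 0.0, 9.0]),
}

def similarity(u, v):
    """Cosine of the angle between two meaning vectors (normalised inner product)."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(similarity(vectors["dog"], vectors["cat"]))  # relatively high: shared 'pet' context
print(similarity(vectors["dog"], vectors["car"]))  # 0.0: no shared contexts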

This has already led to some very promising experimental results.

However, this approach does not deal so well with the logical words. The goal of this project is to use sheaf-theoretic models to provide both a contextual and a logical semantics for natural language. We believe that sheaves provide a generalization of the logical Montague semantics of natural language, which did very well in modeling logical connectives but did not account for contextuality. The project will also aim to combine these ideas with those of the distributional approach, leading to an approach which combines the advantages of Montague-style and vector-space semantics. Prerequisites: The interested student should have taken the category theory and computational linguistics courses, or be familiar with the contents of these.

Description: One of the most successful applications of quantum information science is quantum key distribution, which enables separated parties to send secret messages, with security guaranteed by the laws of quantum theory. The mysterious phenomenon of “quantum nonlocality”, wherein two quantum systems appear to influence one another even though they are separated in space, can be used to design a particularly strong kind of key distribution protocol. The idea is that the honest users do not need to trust that their quantum devices are behaving as advertised, or even that quantum theory is correct. The project will explore the relationship between different kinds of nonlocality and the possibilities for secure communication. A student taking this project should also be taking the Quantum Computer Science course. Some extra reading to cover the basic formalism of quantum theory would be an advantage.

Michael Benedikt: This project will look at how to find the best plan for a query, given a collection of data sources with access restrictions. We will look at logic-based methods for analyzing query plans, taking into account integrity constraints that may exist on the data. B 2017-18

C One recent ambitious project from the Royal Society aims to digitise all transactions of the Royal Society over the years.

As the earliest scientific journal publication in the world, this resource is an invaluable asset to the history of human scientific knowledge. However, given the historical span and sheer volume of this information, being able to uniquely and accurately identify each contributing fellow and their metadata is a real challenge. Furthermore, not all the contributors are Royal Society Fellows, and therefore their unique identity and metadata are not retained in the Royal Society knowledge base. Both issues make the digitisation project a real challenge in achieving high accuracy and coverage. This UG project will work together with the Royal Society to design a novel algorithm that uses multiple features to align scholarship entities from different datasets, in order to enhance the quality of existing datasets and the coverage of the scholarship knowledgebase from the Royal Society. B 2017-18

C Smartphone applications often collect personal data and share it with various first- and third-party entities, for purposes like advertising and analytics.

Using dynamic traffic analysis techniques, we have mapped many of the third-party data flows, including particular data types, from popular applications. The aim of this project would be to extend this work by deploying a mobile application that automatically reveals to the user what kinds of data are sent to whom via the apps they have installed on their device. Alternatively, the project could look into extending our existing traffic analysis framework in order to scale up the analysis and improve the accuracy and coverage of the tracker detection.

C In domains such as manufacturing there may be a large number of individual steps required to complete an overall task, with various constraints between the steps and finite availability of resources. For example, an aircraft may require hundreds of thousands of steps to build, with constraints like "we cannot mount the engines before the wings", and resources like the number of workers and pieces of key machinery.

Scheduling software exists that takes the lists of steps, constraints, and resources and generates feasible schedules; that is, it produces lists of which steps should be performed at what times. Given the complexity of the problem it is impractical to generate optimal schedules, but in general close-to-optimal schedules ('good schedules') can be generated in a reasonable time. However, the choice of which good schedule to use is often determined by factors that are not known early in the process or are difficult to quantify, such as the layout of a factory or the temporary loss of a worker due to illness. The goal of this project is to take an existing scheduling program and a class of real-life industrial problems and to develop a simulation program that could assist a process engineer to visualise the similarities and differences between a small number of good schedules, and hence to interactively adjust the scheduling parameters in order to improve the schedule.

For example, the set of feasible schedules can be displayed as a graph with the steps as graph nodes, and a visualisation might show the progress of the schedule as annotations on the graph nodes, whilst providing a set of on-screen controls to adjust the scheduling parameters.
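As a small, hypothetical illustration of treating the steps as graph nodes, the Python sketch below computes earliest start times from invented durations and precedence constraints via a forward pass over the step graph; a visualisation could annotate exactly these values onto the nodes.

# Toy step graph: durations per step and precedence constraints
# ("engines" cannot start before "wings" is finished, etc.).
# All data here is invented for illustration.
durations = {"fuselage": 5, "wings": 3, "engines": 4, "paint": 2}
predecessors = {
    "fuselage": [],
    "wings": ["fuselage"],
    "engines": ["wings"],
    "paint": ["wings", "engines"],
}

def earliest_start_times(durations, predecessors):
    """Forward pass over the precedence graph (steps as nodes)."""
    start = {}
    remaining = dict(predecessors)
    while remaining:
        # Pick the steps whose predecessors have all been scheduled already.
        ready = [s for s, preds in remaining.items()
                 if all(p in start for p in preds)]
        if not ready:
            raise ValueError("cycle in precedence constraints")
        for s in ready:
            preds = remaining.pop(s)
            start[s] = max((start[p] + durations[p] for p in preds), default=0)
    return start

for step, t in sorted(earliest_start_times(durations, predecessors).items(),
                      key=lambda kv: kv[1]):
    print(f"{step:10s} starts at t={t}")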

The scheduling program itself is written in C++, but this does not constrain the simulation program to be written in a particular language. The skill-set required of a student taking this project would then be a mixture of two-dimensional graphics, plus a desire to find out more about two-dimensional animation and graphical user interface design. There is also the option of applying techniques from machine learning in order to automatically improve the schedule quality. The scheduling scenario to be used as an example in this project will be provided by an Oxford-based company who are interested in potentially applying these techniques in the future.

C In domains such as manufacturing there may be a large number of individual steps required to complete an overall task, with various constraints between the steps and finite availability of resources. For example, an aircraft may require hundreds of thousands of steps to build, with constraints like "we cannot mount the engines before the wings", and resources like the number of workers and pieces of key machinery. Scheduling software exists that takes the lists of steps, constraints, and resources and generates feasible schedules; that is, it produces lists of which steps should be performed at what times. Given the complexity of the problem it is impractical to generate optimal schedules, but in general close-to-optimal schedules ('good schedules') can be generated in a reasonable time.

However, the choice of which good schedule to use is often determined by factors that are not known early in the process or are difficult to quantify, such as the layout of a factory or the temporary loss of a worker due to illness. The goal of this project is to take an existing scheduling program and a class of real-life industrial problems and to develop a visualisation program that could help an end-user picture the running of a particular schedule as a three-dimensional animation. The scheduling program itself is written in C++, but this does not constrain the visualisation program to be written in a particular language. The skill-set required of a student taking this project would primarily be in three-dimensional animation, either using their own code or by bolting onto an existing animation tool such as Blender. It should also be possible for the end-user to easily vary the context of the visualisation (i.e. the placement of the equipment in the three-dimensional world and its relationship to the output of the scheduling program). The scheduling scenario to be used as an example in this project will be provided by an Oxford-based company who are interested in potentially applying these techniques in the future.

C Data integration systems allow users to effectively access data sitting in multiple datasources (typically relational databases) by means of queries over a global schema.

In practice, datasources often contain sensitive information that the data owners want to keep inaccessible to users. In a recent research paper, the project supervisors have formalized and studied the problem of determining whether a given data integration system discloses sensitive information to an attacker. The paper studies the computational properties of the relevant problems and also identifies situations in which practical implementations are feasible. The goal of the project is to design and implement practical algorithms for checking whether information disclosure can occur in a data integration setting. These algorithms would be applicable to the aforementioned situations for which practical implementations seem feasible.

Prerequisites: Familiarity with Databases. The students would also benefit from taking the Knowledge Representation and Reasoning course and/or Theory of Data and Knowledge Bases. B 2017-18

C Description: Suppose that people living in a given geographic area would like to decide where to place a certain facility (say, a library or a gas station). There are several possible locations, and each person prefers to have the facility as close to them as possible. However, the central planner, who will make the final decision, does not know the voters' locations, and, moreover, for various reasons (such as privacy or the design of the voting machine), the voters cannot communicate their locations.

Instead, they communicate their ranking over the available locations, ranking them from the closest one to the one that is furthest away. The central planner then applies a fixed voting rule to these rankings. The quality of each location is determined by the sum of distances from it to all voters, or alternatively by the maximum distance. A research question that has recently received a substantial amount of attention is whether classic voting rules tend to produce good-quality solutions in this setting. The goal of the project would be to empirically evaluate various voting rules with respect to this measure, both for single-winner rules and for multi-winner rules (where more than one facility can be opened).
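Below is a minimal sketch of the kind of empirical evaluation described above, assuming voters and candidate locations on a line and using the Borda rule purely as an example single-winner rule; all positions are randomly generated for illustration.

import random

random.seed(0)

# Voters and candidate facility locations on a line; positions are random
# and purely illustrative.
voters = [random.uniform(0, 100) for _ in range(50)]
locations = [random.uniform(0, 100) for _ in range(5)]

def social_cost(loc, voters):
    """Sum of distances from all voters to a location."""
    return sum(abs(v - loc) for v in voters)

def borda_winner(voters, locations):
    """Each voter ranks locations by distance; Borda scores the rankings."""
    scores = [0] * len(locations)
    for v in voters:
        ranking = sorted(range(len(locations)), key=lambda i: abs(v - locations[i]))
        for rank, i in enumerate(ranking):
            scores[i] += len(locations) - 1 - rank
    return max(range(len(locations)), key=lambda i: scores[i])

optimal = min(range(len(locations)), key=lambda i: social_cost(locations[i], voters))
winner = borda_winner(voters, locations)
print("optimal cost:", social_cost(locations[optimal], voters))
print("Borda winner cost:", social_cost(locations[winner], voters))
print("distortion:",
      social_cost(locations[winner], voters) / social_cost(locations[optimal], voters))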

In addition to purely empirical work, there are interesting theoretical questions that one could explore, such as proving worst-case upper and lower bounds on the performance of various rules. Prerequisites: basic programming skills. B 2017-18

C (Supervisor C Schallhart) Web pages are the past, since interactive web application interfaces have reshaped the online world. With all their feature richness, they enrich our personal online experience and provide some great new challenges for research. In particular, forms have become much more complex in assisting the user during form filling, e.g. with completion options, or through structuring the form filling process by dynamically enabling or hiding form elements. Such forms are a very interesting research topic, but their complexity has so far prevented the establishment of a corpus of modern forms with which to benchmark different tools dealing with forms automatically. This MSc project will fill this gap by building a corpus of such forms: based on a number of production sites from one or two domains, we will build our corpus of web interfaces, connected to a (shared) database. Not only will the future evaluations in the DIADEM project rely on this corpus, but we will also publish the corpus, promoting it as a common benchmark for the research community working on forms.

Knowledge of Java, HTML, CSS, Javascript, and web application development is required. B 2017-18

C (Joint with C Schallhart) Unearthing the knowledge hidden in queryable web sites requires a good understanding of the involved forms. As part of DIADEM, we are developing OPAL (Ontology based web Pattern Analysis with Logic), a tool to recognize forms belonging to a parameterizable application domain, such as the real estate or used car market. OPAL determines the meaning of individual form elements, e.g. it identifies the field for the minimum or maximum price or for some location. This MSc project will build upon OPAL to deal not only with static forms but also with sequences of interrelated forms, as in the case of a rough initial form followed by a refinement form, or in the case of forms showing some options only after filling in some other parts. Over the course of this MSc project, we will develop a tool which invokes OPAL to analyze a given form, to explore all available submission mechanisms on this form, to analyze the resulting pages for forms continuing the initial query, and to combine the outcome of all found forms into a single interaction description. Knowledge of Java, HTML and CSS is required; prior experience in logic programming would be a strong plus. B 2017-18

C Bayesian deep learning (BDL) is a field of Machine Learning where we develop tools that can reason about their confidence in their predictions.

A main challenge in BDL is comparing different tools to each other, with common benchmarks being much needed in the field. In this project we will develop a set of tools to evaluate Bayesian deep learning techniques, reproduce common techniques in BDL, and evaluate them with the developed tools. The tools we will develop will rely on downstream tasks that have made use of BDL in real-world applications, such as parameter estimation in strong gravitational lensing with neural networks. Prerequisites: only suitable for someone who has done Probability Theory, has worked in Machine Learning in the past, and has strong programming skills (Python). B 2017-18

C Bayesian deep learning (BDL) is a field of Machine Learning where we develop tools that can reason about their confidence in their predictions.

A main challenge in BDL is comparing different tools to each other, with common benchmarks being much needed in the field. In this project we will develop a set of tools to evaluate Bayesian deep learning techniques, reproduce common techniques in BDL, and evaluate them with the developed tools. The tools we will develop will rely on downstream tasks that have made use of BDL in real-world applications, such as detecting diabetic retinopathy from fundus photos and referring the most uncertain decisions for further inspection. Prerequisites: only suitable for someone who has done Probability Theory, has worked in Machine Learning in the past, and has strong programming skills (Python). B 2017-18

C Reinforcement learning (RL) algorithms often require large amounts of data for training, data which is often collected from simulations or experiments on robotics systems.

The requirement for large amounts of data forms a major hurdle to using RL algorithms for tasks in robotics, though, where each real-world experiment costs time and risks damage to the robot. In this project we will develop a mock "Challenge" similar to Kaggle challenges. In this challenge we will restrict the amount of data a user can query from the system at each point in time, and try to implement simple RL baselines under this constraint. We will inspect the challenge definition and try to improve it. Prerequisites: only suitable for someone who has worked in Machine Learning in the past, is familiar with Reinforcement Learning, and has strong programming skills (Python).

C "Time series data arise as the output of a wide range of scientific experiments and clinical monitoring techniques.Typically the system under study will either be undergoing time varying changes which can be recorded, or the system will have a time varying signal as input and the response signal will be recorded.Such recordings contain valuable information about the underlying system under study, and gaining insight into the behaviour of that system typically involves building a mathematical or computational model of that system which will have embedded within in key parameters governing system behaviour.The problem that we are interested in is inferring the values of these key parameter through applications of techniques from machine learning and data science.

Currently used methods include Bayesian inference (Markov Chain Monte Carlo (MCMC), Approximate Bayesian Computation (ABC)) and non-linear optimisation techniques, although we are also exploring the use of other techniques such as probabilistic programming and Bayesian deep learning. We are also interested in developing techniques that will speed up these algorithms, including parallelisation and the use of Gaussian Process emulators of the underlying models. Application domains of current interest include modelling of the cardiac cell (for assessing the toxicity of new drugs), understanding how biological enzymes work (for application in developing novel fuel cells), as well as a range of basic science problems. Prerequisites: some knowledge of Python

C Description: I-cut-you-choose is the classical way for two people to share a divisible good. For three people, there exists a sequence of operations using 5 cuts that is also envy-free, but for 4 or more people, it is unknown whether you can share in an envy-free manner using a finite number of cuts.

(This is with respect to a well-known class of procedures that can be represented using a tree whose nodes are labelled with basic "cut" and "choose" operations.) The general idea of this project is to generate and test large numbers of potential cake-cutting procedures, and to measure the "extent of envy" in the case where they are not envy-free. (See Wikipedia's page on the "cake-cutting problem".) It is of interest to find out the amount of envy that must exist in relatively simple procedures. Prerequisites: competence and enthusiasm for program design and implementation; mathematical analysis and proofs.
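As a concrete, toy illustration of measuring the "extent of envy" of a procedure, the Python sketch below runs I-cut-you-choose for two agents with invented piecewise-constant valuations and reports how much each agent envies the other's piece (zero here, as expected for cut-and-choose).

# Piecewise-constant valuations over [0, 1]: total value of each of 4 equal segments.
# Each list sums to 1 (normalised); the numbers are invented for illustration.
alice = [0.4, 0.4, 0.1, 0.1]
bob   = [0.1, 0.1, 0.4, 0.4]

def piece_value(valuation, a, b, n=4):
    """Value of the interval [a, b] under a piecewise-constant valuation on n segments."""
    total = 0.0
    for i in range(n):
        lo, hi = i / n, (i + 1) / n
        overlap = max(0.0, min(b, hi) - max(a, lo))
        total += valuation[i] * overlap * n
    return total

def cut_and_choose(cutter, chooser):
    """I-cut-you-choose: the cutter picks x so the two halves are equal in her
    valuation; the chooser takes the piece she prefers."""
    lo, hi = 0.0, 1.0
    for _ in range(60):                       # binary search for the 50/50 point
        mid = (lo + hi) / 2
        if piece_value(cutter, 0.0, mid) < 0.5:
            lo = mid
        else:
            hi = mid
    x = (lo + hi) / 2
    left, right = (0.0, x), (x, 1.0)
    if piece_value(chooser, *left) >= piece_value(chooser, *right):
        return right, left                    # chooser takes the left piece
    return left, right

alice_piece, bob_piece = cut_and_choose(alice, bob)
# "Extent of envy": how much more an agent values the other's piece than her own.
envy_alice = piece_value(alice, *bob_piece) - piece_value(alice, *alice_piece)
envy_bob = piece_value(bob, *alice_piece) - piece_value(bob, *bob_piece)
print("Alice's piece:", alice_piece, "envy:", max(0.0, envy_alice))
print("Bob's piece:", bob_piece, "envy:", max(0.0, envy_bob))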

C The Product-Mix Auction was devised by Klemperer in 2008 for the purpose of providing liquidity to commercial banks during the financial crisis; it was used for a number of years by the Bank of England. The project investigates an extension to the original auction that allows buyers more flexibility to express their requirements. This extension, allowing "negative bids" to be made, allows a buyer to express any "strong substitutes" demand function. In this extension, the search for prices takes the form of a submodular minimisation problem, and the project envisages applying algorithms such as Fujishige-Wolfe to this challenge.

We envisage applying the algorithms to simulated data and obtaining experimental results about their runtime complexity. We also envisage testing local-search heuristics. B 2017-18

C Description: The aim of the project is to allow a user to construct a finite automaton (or alternatively, a regular expression) by providing examples of strings that ought to be accepted by it, in addition to examples that ought not to be accepted. An initial approach would be to test simple finite automata against the strings provided by the user; more sophisticated approaches could be tried out subsequently. One possibility would be to implement an algorithm proposed in a well-known paper by Angluin, "Learning regular sets from queries and counterexamples".
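A minimal sketch of the "initial approach" mentioned above: represent a candidate finite automaton explicitly and test it against user-provided positive and negative example strings. The automaton and the examples are invented for illustration.

# A candidate DFA over {0, 1} accepting strings with an even number of 1s.
dfa = {
    "start": "even",
    "accepting": {"even"},
    "delta": {
        ("even", "0"): "even", ("even", "1"): "odd",
        ("odd", "0"): "odd",   ("odd", "1"): "even",
    },
}

def accepts(dfa, string):
    """Run the DFA on the string and report whether it ends in an accepting state."""
    state = dfa["start"]
    for symbol in string:
        state = dfa["delta"][(state, symbol)]
    return state in dfa["accepting"]

def consistent(dfa, positives, negatives):
    """Check the candidate automaton against the user's examples."""
    return (all(accepts(dfa, s) for s in positives)
            and not any(accepts(dfa, s) for s in negatives))

positives = ["", "0", "11", "0110"]
negatives = ["1", "10", "111"]
print(consistent(dfa, positives, negatives))  # True for this candidate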

Prerequisites: competence and enthusiasm for program design and implementation; familiarity with finite automata, context-free languages, etc. B 2017-18

C Description: In social choice theory, a general theme is to take a set of rankings of a set of candidates (also known as alternatives) and compile an "overall" ranking that attempts to be as close as possible to the individual rankings. Each individual ranking can be thought of as a vote that we want to compile into an overall decision.

The project involves taking some real-world data, for example university league tables, and computing aggregate rankings to see which individual votes are closest to the consensus.
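One concrete way to compute such an aggregate ranking is the Kemeny rule discussed in the next paragraph; the toy Python sketch below brute-forces it over a small invented set of votes and reports each vote's Kendall tau distance to the consensus (brute force is only feasible for a handful of candidates).

from itertools import permutations, combinations

# Toy votes: each vote ranks candidates A-D from best to worst (invented data).
votes = [
    ["A", "B", "C", "D"],
    ["A", "C", "B", "D"],
    ["B", "A", "D", "C"],
    ["A", "B", "D", "C"],
]
candidates = ["A", "B", "C", "D"]

def kendall_tau(r1, r2):
    """Number of candidate pairs ordered differently by the two rankings."""
    pos1 = {c: i for i, c in enumerate(r1)}
    pos2 = {c: i for i, c in enumerate(r2)}
    return sum(
        1 for a, b in combinations(candidates, 2)
        if (pos1[a] < pos1[b]) != (pos2[a] < pos2[b])
    )

def kemeny_consensus(votes):
    """Brute force: the ranking minimising total Kendall tau distance to all votes.
    Exponential in the number of candidates (the problem is hard in general)."""
    return min(permutations(candidates),
               key=lambda r: sum(kendall_tau(list(r), v) for v in votes))

consensus = list(kemeny_consensus(votes))
print("Kemeny consensus:", consensus)
for v in votes:
    print(v, "distance to consensus:", kendall_tau(v, consensus))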

In the case of the Kemeny consensus, which is an NP-complete rank aggregation rule, it is of interest to exploit heuristics that may be effective on real-world data, and to see for how large a data set the Kemeny consensus can be computed. Prerequisites: familiarity with polynomial-time algorithms, NP-hardness; interest in computational experiments on data

C In the well-known game of rock-paper-scissors, it is clear that any player can "break even" by playing entirely at random. On the other hand, people do a poor job of generating random numbers, and expert players of the game can take advantage of predictable aspects of opponents' behaviour. In this project, we envisage designing algorithms that adapt to a human opponent's behaviour, using for example no-regret learning techniques, and modelling the opponent as a probabilistic automaton. Ideally, the student taking this project should manage to persuade some volunteers to test the software! This is needed to provide data for the algorithms, and should provide feedback to the opponent on what mistakes they are making.
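As a hedged illustration of the no-regret idea, the sketch below runs a multiplicative-weights (Hedge) learner against a simulated opponent with an invented bias towards rock; a real project would replace the simulated opponent with human play and could model the opponent explicitly.

import math
import random

random.seed(1)

MOVES = ["rock", "paper", "scissors"]
BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}

def payoff(ours, theirs):
    """+1 for a win, 0 for a draw, -1 for a loss."""
    if ours == theirs:
        return 0
    return 1 if BEATS[ours] == theirs else -1

def hedge_play(opponent_move, eta=0.2, rounds=2000):
    """Multiplicative-weights (Hedge) learner against a fixed opponent strategy."""
    weights = {m: 1.0 for m in MOVES}
    total = 0
    for t in range(rounds):
        z = sum(weights.values())
        probs = [weights[m] / z for m in MOVES]
        ours = random.choices(MOVES, weights=probs)[0]
        theirs = opponent_move(t)
        total += payoff(ours, theirs)
        # Full-information update: every action's weight is adjusted by the
        # payoff it would have received against the observed opponent move.
        for m in MOVES:
            weights[m] *= math.exp(eta * payoff(m, theirs))
    return total / rounds

# A predictable opponent who over-plays rock (a human-like bias, invented here).
biased_opponent = lambda t: random.choices(MOVES, weights=[0.6, 0.2, 0.2])[0]
print("average payoff vs biased opponent:", hedge_play(biased_opponent))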

Depending on how it goes, we could also consider extending this approach to related games. Prerequisite: Interest in working with probability is important. B 2017-18

C Privacy is not a binary concept; the level of privacy enjoyed by an individual or organisation will depend upon the context within which it is being considered: the more data an attacker has access to, the more potential there may be for privacy compromise. We lack a model which considers the different contexts that exist in current systems, and which would underpin a measurement system for determining the level of privacy risk that might be faced. This project would seek to develop a prototype model – based on a survey of known privacy breaches and common practices in data sharing.

The objective is to propose a method by which privacy risk might be considered, taking into consideration the variety of (threat and data-sharing) contexts that any particular person or organisation might be subjected to. It is likely that a consideration of the differences and similarities between the individual and organisational points of view will need to be made, since the nature of the contexts faced could be quite diverse. B 2017-18

C Computer Vision allows machines to recognise objects in real-world footage. In principle, this allows machines to flag potential threats in an automated fashion based on historical and present images in video footage of real-world environments. Automated threat detection mechanisms to help security guards identify threats would be of tremendous help to them, especially if they have to temporarily leave their post, or have to guard a significant number of areas.

In this project, students are asked to implement a system that is able to observe a real environment over time and attempt to identify potential threats, independent of a number of factors. The student is encouraged to approach this challenge as they see fit, but would be expected to design, implement and assess any methods they develop. One approach might be to implement a camera system using e.g. a web camera or a Microsoft Kinect to conduct anomaly detection on real-world environments, and to flag any issues related to potential threats. Requirements: Programming skills required. B 2017-18

C The behaviour of controls will be dependent upon how they are used. One obvious example: the protection afforded by a firewall is dependent upon the maintenance of the rules that determine what the firewall stops and what it does not. The benefit of various technical controls in an operational context lacks good evidence and data, so there is scope to consider the performance of controls in a lab environment.

This mini-project would select one or more controls from the CIS Top 20 Critical Security Controls (CSC) (version 6.1) and seek to develop laboratory experiments (and implement them) to gather data on how the effectiveness of the control is impacted by its deployment context (including, for example, configuration, dependence on other controls, and the threat faced). Requirements: Students will need an ability to develop a test-suite and deploy the selected controls. B 2017-18

C While these controls might be advantageous in many regards, they are often up against changing regulatory frameworks which place differing demands with respect to information security and privacy. One potential challenge is where these security control sets suggest best practice that is not in line with regulatory requirements in the company's industry or jurisdiction.

This would lead to companies following key standards but failing to reach requisite levels of security. This is especially the case where standards suggest that certain controls are optional when, actually, they may be critical for a certain locale. The aim of this project will be to consider these issues with special emphasis on the CIS Top 20 Critical Security Controls (CSC) (version 6.1) and their context of use within Europe – particularly with the existing Data Protection Act and the upcoming General Data Protection Regulation. Version 6.1 is of interest given that it is structured to have ‘foundational’ and ‘advanced’ controls, which allow companies a flexibility that might not actually be afforded with current regulation in mind. B 2017-18

C Cybersecurity visualization helps analysts and risk owners alike to make better decisions about what to do when the network is attacked. In this project the student will develop novel cybersecurity visualizations.


The student is free to approach the challenge as they see fit, but would be expected to design, implement and assess the visualizations they develop. These projects tend to have a focus on network traffic visualization, but the student is encouraged to visualize the datasets they would be most interested in.

Other interesting topics might for instance include: host-based activities (e.g. CPU/RAM/network usage statistics) and network scans (incl. …). Past students have visualized network traffic patterns and android permission violation patterns. Other projects on visualizations are possible (not just cybersecurity), depending on interest and inspiration. Requirements: Programming skills required. B 2017-18

C This project would seek to study the general form of distributed ledgers, and the claimed nuances and general form of implementations, and to assess all the possible weak points that might make implementations open to compromise. The general approach will be to develop a detailed understanding of the security requirements and inter-dependencies of functionality – capturing the general security case for a distributed ledger and how it decomposes into lower-level security requirements.

Then an assessment will be made of each, and the potential for vulnerabilities in design and implementation considered.

The ultimate result will be a broad analysis of potential weak points. If possible, these will then be practically investigated in a lab-based environment. One output might be a proposal for testing strategies. B 2017-18

C This project would utilise the process algebra CSP and the associated model checker FDR to explore various types of threat and how they might successfully compromise a distributed ledger. This type of modelling would indicate possible attacks on a distributed ledger, and could guide subsequent review of actual designs and testing strategies for implementations.

The modelling approach would be based on the crypto-protocol analysis techniques already developed for this modelling and analysis environment, and would seek to replicate the approach for a distributed ledger system. Novel work would include developing the model of the distributed ledger, considering which components are important, formulating various attacker models, and also formulating the security requirements / properties to be assessed using this model-checking based approach. Requirements: In order to succeed in this project students would need to have a working knowledge of the machine-readable semantics of CSP and also the model checker FDR. An appreciation of threat models and capability will need to be developed. B 2017-18

C High dynamic range imaging (HDRI) allows more accurate information about light to be captured, stored, processed and displayed to observers.

In principle, this allows viewers to obtain more accurate representations of real-world environments and objects. Naturally, HDRI would be of interest to museum curators to document their objects, particularly non-opaque objects or objects whose appearance significantly alters depending on the amount of lighting in the environment. Currently, few tools exist that aid curators, archaeologists and art historians in studying object surfaces under user-defined parameters in meaningful ways. In this project the student is free to approach the challenge as they see fit, but would be expected to design, implement and assess any tools and techniques they develop. The student will develop techniques to study these objects under user-specified conditions to enable curators and researchers to study the surfaces of these objects in novel ways.

These methods may involve tone mapping or other modifications of light exponents to view objects under non-natural viewing conditions, so as to have surface details stand out in ways that are meaningful to curators. B 2017-18

C High dynamic range imaging (HDRI) allows more accurate information about light to be captured, stored, processed and displayed to observers. In principle, this allows viewers to obtain more accurate representations of real-world environments. Naturally, HDRI would be of interest to security personnel who use Closed-Circuit Television (CCTV) to identify potential threats or review security footage. Conceivably, it may be possible to identify threats or activities in shadows or overexposed areas.
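As a small illustration of the exposure adjustment and tone mapping mentioned above, the Python sketch below applies a photographic exposure shift and the simple global operator L/(1+L) to synthetic HDR luminance values; real footage and a perceptually tuned operator would of course be needed in practice.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic HDR luminance data spanning several orders of magnitude
# (a stand-in for real HDR footage).
hdr = rng.lognormal(mean=0.0, sigma=2.0, size=(4, 4))

def adjust_exposure(luminance, stops):
    """Shift exposure by a number of photographic stops (factor of 2 per stop)."""
    return luminance * (2.0 ** stops)

def simple_tonemap(luminance):
    """Simple global tone-mapping operator: compresses [0, inf) into [0, 1)."""
    return luminance / (1.0 + luminance)

for stops in (-2, 0, +2):
    ldr = simple_tonemap(adjust_exposure(hdr, stops))
    print(f"exposure {stops:+d} stops -> displayable range "
          f"[{ldr.min():.3f}, {ldr.max():.3f}]")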

Another example is being able to better identify the facial features of someone who is wearing a hoodie. The student is encouraged to approach this challenge as they see fit, but would be expected to design, implement and assess any methods they develop. One approach might be to implement an HDR viewer and then conduct a user study in which participants attempt to identify how well low dynamic range content viewing compares to HDR viewing in a physical security context. Another approach might be to implement an HDR viewer that changes exposures based on where the viewer is looking. We have eyetrackers at our disposal that students would be able to use as part of their assessment.

We will also be able to provide training for the student, so that they are able to use the eyetracking tools themselves. Requirements: Programming skills required. B 2017-18

C Motion-gesture peripherals have become popular in recent years, with Leap Motion, Myo and Kinect being a few examples. In this project the student will develop an application using such peripherals. The student is free to approach the challenge as they see fit, but would be expected to design, implement and assess the tools they develop. The tool would also need to serve a meaningful purpose.

These projects tend to have a strong focus on human-computer interaction elements, particularly on designing and implementing user-friendly and meaningful motion gestures for a variety of real-world applications. Past students for instance have developed support for Myo on Android devices (so one does not have to touch a tablet screen while cooking) as well as added Leap Motion and Kinect support to cyber security visualization tools. Suitable for 3rd or 4th year undergraduates, or MSc. Other projects on novel human-computer interfaces are possible, depending on interest and inspiration. B 2017-18

C There is a large investment being made by the international community aimed at helping nations and regions to develop their capacity in cybersecurity.

There is scope to study in more detail the global trends in capacity building in cybersecurity, the nature of the work and the partnerships that exist to support it. An interesting analysis might be to identify what is missing (through comparison with the Cybersecurity Capacity Maturity Model, a key output of the Centre), and also to consider how strategic, or not, such activities appear to be. An extension of this project, or indeed a second parallel project, might seek to compare the existing efforts with the economic and technology metrics that exist for countries around the world, exploring whether the data shows any relationships between those metrics and the capacity building activities underway. This analysis would involve regression techniques.

B 2017-18C Current penetration testing is typically utilised for discovering how organisations might be vulnerable to external hacks, and testing methods are driven by using techniques determined to be similar to approaches used by hackers. The result is a report highlighting various exploitable weak-points and how they might result in unauthorised access should a malign entity attempt to gain access to a system. Recent research within the cybersecurity analytics group has been studying the relationship between these kinds of attack surfaces and the kinds of harm that an organisation might be exposed to. An interesting question is whether an orientation around intent, or harm, might result in a different test strategy: would a different focus be given to the kinds of attack vectors explored in a test if a particular harm were targeted? This mini-project would aim to explore this question by designing penetration test strategies based on a set of particular harms, and then seek to consider potential differences with current penetration testing practices through consultation with the professional community.

Requirements: Students will need to have a working understanding of penetration testing techniques. B 2017-18C Prior research has considered how we might better understand and predict the consequences of cyber-attacks based on knowledge of the business processes, people and tasks, and how they utilise the information infrastructure / digital assets that might be exposed to specific attack vectors. However, this can clearly be refined by moving to an understanding of those tasks that are live or active at the time of an attack propagating across a system. If this can be calculated, then an accurate model of where risk may manifest and the harm that may result can be constructed.

This project would explore the potential for such a model through practical experimentation and the development of software monitors to be placed on a network, aimed at inferring the tasks and users that are active based on network traffic.

If time allows then host-based sensors might also be explored (such as on an application server) to further refine the understanding of which users are live on which applications, etc. Requirements: Students must be able to construct software prototypes and have a working knowledge of network architectures and computer systems. B 2017-18C Procedural methods in computer graphics help us develop content for virtual environments (geometry and materials) using formal grammars. Common approaches include fractals and L-systems. Examples of content may include the creation of cities, planets or buildings.
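
As a small illustration of the formal-grammar idea, here is a minimal L-system rewriter in Python; the axiom and production rule below are just an assumed Koch-style example, and an actual project would pair the generated string with a turtle-graphics or geometry interpreter.

    def lsystem(axiom, rules, steps):
        # Iteratively rewrite each symbol using its production rule (identity if none).
        s = axiom
        for _ in range(steps):
            s = "".join(rules.get(c, c) for c in s)
        return s

    # Koch-curve-style example: F means "draw forward", + and - mean "turn".
    print(lsystem("F", {"F": "F+F-F-F+F"}, 2))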

In this project the student will develop an application to create content procedurally. The student is free to approach the challenge as they see fit, but would be expected to design, implement and assess the methods they develop. These projects tend to have a strong focus on designing and implementing existing procedural methods, but also include a portion of creativity. The project can be based on reality, e.g. looking at developing content that has some kind of basis in how the real-world equivalent objects were created (physically-based approaches), or the project may be entirely creative in how it creates content. Past students have, for instance, built tools to generate cities based on real-world examples as well as non-existent city landscapes; other examples include the building of procedural planets, including asteroids and earth-like planetary bodies. B 2017-18C Reflectance Transformation Imaging (RTI) is a powerful set of techniques (the first of which is known as Polynomial Texture Maps, PTMs) that enables us to capture photographs of objects under several lighting conditions. Combined, these RTI images form a single photograph in which users can relight these objects by moving the light sources around the hemisphere in front of the object, but can also specify user-defined parameters, including removing colour or making the objects more specular or diffuse, in order to investigate the surface details in depth. It can be used for forensic investigation of crime scenes as well as cultural heritage documentation and investigation.

The purpose of this project is for the student to implement RTI methods of their preference. B 2017-18C (Joint with Sadie Creese) Smartphone security: one concrete idea is the development of a policy language to allow the authors of apps to describe their behaviour, designed to be precise about the expected access to peripherals and networks and the purpose thereof (data required and usage); this uses skills in formal specification and understanding of app behaviour (by studying open-source apps), possibly leading to prototyping a software tool to perform run-time checking that the claimed restrictions are adhered to. Suitable for good 3rd or 4th year undergraduates, or MSc; Concurrency, Concurrent Programming, Computer Security all possibly an advantage. Other projects within this domain are possible, according to interest and inspiration. Prerequisites: B 2017-18C At Oxford we have developed a framework for understanding the components of an attack, and documenting known attack patterns can be instrumental in developing trip-wires aimed at detecting the presence of insiders in a system.

This project will seek to develop a library of such trip-wires based on a survey of openly documented and commented-upon attacks, using the Oxford framework. There will be an opportunity to deploy the library into a live-trial context, which should afford an opportunity to study the relative utility of the trip-wires within large commercial enterprises. The mini-project would also need to include experimenting with the trip-wires in a laboratory environment, and this would involve the design of appropriate test methods. B 2017-18C There are many tools available for detecting and monitoring cyber-attacks based on network traffic, and these are accompanied by a wide variety of tools designed to make alerts tangible to security analysts. By comparison, the impact of these attacks at an organisational level has received little attention. An aspect that could be enhanced further is the addition of a tool facilitating management and updating of our understanding of business processes, but also of how those processes are dependent on a network infrastructure.

This tool could facilitate the mapping between company strategies and the activities needed to accomplish company goals, and map these down to the network and people assets. At the top of the hierarchy lies the board, responsible for strategic decisions. These decisions are interpreted at the managerial level and could be captured and analysed with business objective diagrams. These diagrams could in turn be refined further to derive business processes and organisational charts, ensuring that decisions made at the top level will be enforced at the lower levels. The combination of business processes and organisational charts could eventually provide the network infrastructure.

For this project we suggest a student could develop novel algorithms for mapping business processes to network infrastructures in an automated way (given the updated business process files). That said, the student is encouraged to approach this challenge as they see fit, but would be expected to design, implement and assess any methods they develop. Other projects on business process modelling are also possible, depending on interest and inspiration. C Nash equilibrium is the standard solution concept for multi-player games. Such games have multiple applications in Logic and Semantics, Artificial Intelligence and Multi-Agent Systems, and Verification and Computer Science.

Unfortunately, Nash equilibria are not preserved under bisimilarity, one of the most important behavioural equivalences for concurrent systems. In a recent paper it was shown that this problem may not arise when certain models of strategies are considered. In this project the aim is to investigate further implications of considering the new model of strategies: for instance, whether a number of logics for strategic reasoning become invariant under bisimilarity if the new model of strategies is considered, whether some logics that are unable to express Nash equilibria can do so with respect to the new model of strategies, and whether the results already obtained still hold for more complex classes of systems, for instance, where nondeterminism has to be considered. Prerequisites: Discrete Mathematics, Introduction to Formal Proof, Logic and Proof. Desirable: Computer-Aided Formal Verification, Computational Complexity. Project type: theory. C Strategy Logic (SL) is a temporal logic to reason about strategies in multi-player games.

Such games have multiple applications in Logic and Semantics, Artificial Intelligence and Multi-Agent Systems, and Verification and Computer Science. SL is a very powerful logic for strategic reasoning -- for instance, Nash equilibria and many other solution concepts in game theory can be easily expressed -- which has an undecidable satisfiability problem and a non-elementary model checking problem. In this project, the aim is to study fragments of SL which can potentially have better results (decidability and complexity) with respect to the satisfiability and model checking problems. The fragments to be studied can be either syntactic fragments of the full language or semantic fragments where only particular classes of models are considered. Prerequisites: Discrete Mathematics, Introduction to Formal Proof, Logic and Proof. Desirable: Computer-Aided Formal Verification, Computational Complexity. Project type: theory. B 2017-18C Given a homogeneous system of linear equations A x = 0, a Hilbert basis is the unique finite minimal set of non-negative solutions from which every non-negative solution of the system can be generated.

Computing Hilbert bases is a fundamental problem encountered in various areas of computer science and mathematics, for instance in decision procedures for arithmetic theories, the verification of infinite-state systems and pure combinatorics. We plan to revisit an approach to computing Hilbert bases described in [1] that is highly parallelizable. With the ubiquity of multi-core architectures, it seems conceivable that, with proper engineering, this approach will outperform existing approaches.

The goal of this project is to deliver an implementation of the algorithm of [1] and benchmark it against competing tools.
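
For orientation only, the Python sketch below is a naive bounded enumeration that illustrates what a Hilbert basis is (the minimal non-zero non-negative solutions under the componentwise order); it is emphatically not the parallel algorithm of [1], and the bound is an assumption made purely for illustration.

    from itertools import product
    import numpy as np

    def hilbert_basis_bounded(A, bound):
        # Enumerate non-negative integer solutions of A x = 0 with entries <= bound,
        # then keep those that are minimal under the componentwise ordering.
        n = A.shape[1]
        sols = [np.array(x) for x in product(range(bound + 1), repeat=n)
                if any(x) and not np.any(A @ np.array(x))]
        return [s for s in sols
                if not any(np.all(t <= s) and not np.array_equal(t, s) for t in sols)]

    # Example: the system x1 = x2 has Hilbert basis {(1, 1)} (within the chosen bound).
    print(hilbert_basis_bounded(np.array([[1, -1]]), bound=3))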

If successful, an integration into the SageMath platform could be envisioned. Though widely used for modelling neural networks, TensorFlow actually allows for expressing any computation that can be represented as a data flow graph. In computational complexity, such graphs have been studied for a long time in the context of arithmetic circuits. The combination of techniques from circuit complexity theory and number theory has in recent years led to algorithms that allow for evaluating arithmetic circuits more efficiently than standard methods.
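
The toy Python sketch below is one assumed way to picture a data flow graph as an arithmetic-circuit-like DAG with shared subexpressions; it illustrates the object of study, not TensorFlow's actual graph representation.

    def eval_circuit(circuit, inputs):
        # circuit: node name -> ("input",) or (op, left, right); evaluation caches
        # each node once, so shared subexpressions are reused as in a DAG.
        ops = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}
        cache = {}
        def value(node):
            if node not in cache:
                spec = circuit[node]
                if spec[0] == "input":
                    cache[node] = inputs[node]
                else:
                    op, left, right = spec
                    cache[node] = ops[op](value(left), value(right))
            return cache[node]
        return {n: value(n) for n in circuit}

    # (x + y) * x, with the input x shared by two nodes.
    circuit = {"x": ("input",), "y": ("input",), "s": ("+", "x", "y"), "out": ("*", "s", "x")}
    print(eval_circuit(circuit, {"x": 2, "y": 3})["out"])  # 10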

The goal of this project is to investigate whether we can apply those techniques in order to efficiently evaluate (subclasses of) the dataflow graphs present in TensorFlow. To this end, we plan to develop a formal model of TensorFlow data flow graphs, and to analyse the computational complexity of evaluating such graphs and computing gradients. Prerequisites: Computational Complexity and a robust background in mathematics. B 2017-18C The goal of this project is to determine the computational complexity of the first-order theory of the integers with addition and equality, FO(Z,+,=), but without order. This theory is a strict fragment of Presburger arithmetic, an arithmetic theory that finds numerous applications, for instance in the verification of infinite-state systems. We aim to make an original scientific contribution by determining what time and space resources are needed in order to decide a sentence in FO(Z,+,=).

If time permits, an implementation of the decision procedure developed in this project could be envisioned. Prerequisites: good familiarity with first-order logic, linear algebra and computational complexity. B 2017-18C This is a project in the specification of hardware, which I expect to make use of functional programming. There is a great deal of knowledge about ways (that are good by various measures) of implementing standard arithmetic operations in hardware. However, most presentations of these circuits are at a very low level, involving examples, diagrams, and many subscripts, for example these descriptions. The aim of this project is to describe circuits like this in a higher-level way by using the higher order functions of functional programming to represent the structure of the circuit.
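
To hint at what such a higher-order description might look like, here is a small Python sketch (the project itself would more naturally use a functional language) in which a ripple-carry adder is obtained by applying a combinator to a one-bit full-adder cell; the cell and the least-significant-bit-first ordering are assumptions made for illustration.

    def full_adder(a, b, cin):
        # One-bit full adder expressed with gate-level boolean operations.
        s = a ^ b ^ cin
        cout = (a & b) | (cin & (a ^ b))
        return s, cout

    def ripple_carry(cell):
        # Higher-order combinator: turn any one-bit cell into an n-bit adder.
        def adder(xs, ys, carry=0):
            out = []
            for a, b in zip(xs, ys):          # least-significant bit first
                s, carry = cell(a, b, carry)
                out.append(s)
            return out, carry
        return adder

    add4 = ripple_carry(full_adder)
    print(add4([1, 0, 1, 0], [1, 1, 0, 0]))   # 5 + 3 = 8, i.e. ([0, 0, 0, 1], 0)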

It should certainly be possible to execute instances of the descriptions as simulations of circuits (by plugging in simulations of the component gates), but the same descriptions might well be used to generate circuit netlists for particular instances of the circuit, and even to produce the diagrams. B 2017-18C The aim is to take some mathematics that would be within the grasp of a mathematically-inclined sixth-former and turn it into some attention-grabbing web pages. Ideally this reveals a connection with computing science. I imagine that this project necessarily involves some sort of animation, and I have visions of Open University television maths lectures. The programming need not be the most important part of this project, though, because some of the work is in choosing a topic and designing problems and puzzles and the like around it.

There's a lot of this sort of thing about, though, so it would be necessary to be a bit original. Think of GeomLab (and then perhaps think of something a little less ambitious). It might involve logic and proof, it might be about sequences and series, it might be about graphs, it might be about the mathematics of cryptography. C "Background: Many reasoning tasks for expressive ontology languages such as OWL 2 DL can be reduced to or approximated by rule-based reasoning in formalisms such as OWL 2 RL, Datalog, or Disjunctive Datalog.

A key step of any such reduction is a structural transformation of the original ontology into a set of rules in the target formalism that are in a certain sense equivalent to the ontology. The goal of the project is to implement an algorithm for transforming OWL 2 DL ontologies into rules that can then serve as input to Datalog and OWL 2 RL reasoners, as well as ASP solvers. Prerequisites: * Good programming skills in Java * The following courses are relevant for this project: - Logic and Proof - Knowledge Representation and Reasoning" B 2017-18C One of the interesting approaches to reducing overfitting in neural networks is to add noise to the inputs and activations before performing a gradient step. The key insight is that this noise injection prevents the learnt weights from being too delicately balanced to fit the data; some kind of robustness is necessary to fit noisy data. Another interesting consequence of noisy data is that recent work shows that learning algorithms using noisy data may be better at protecting the privacy of the data.

Thus, there may be twin advantages to this approach. This project will involve understanding the background in this topic, performing simulations to understand the behaviour and hopefully developing new theories. This project may involve collaboration with Mr. B 2017-18C Evolution is basically a form of learning, where the search happens through variation caused by mutations, recombination and other factors, and (natural) selection is a feedback mechanism. There has been much recent work on understanding evolution through a computational lens. One of the fundamental building blocks of life is circuits in which the production of a protein is controlled by other proteins (transcription factors); these circuits are known as transcription networks. Mathematical models of transcription networks have been proposed using continuous-time Markov processes. The focus of the project is to use these models to understand the expressive power of these networks and whether simple evolutionary algorithms, through suitably guided selection, can result in complex expressive patterns.
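
To give a feel for the kind of simulation involved, the following Python sketch runs a Gillespie-style simulation of a toy birth-death protein model; it is a stand-in, under assumed rate constants, for the continuous-time Markov models of transcription networks mentioned above.

    import random

    def gillespie_birth_death(k_on=1.0, k_off=0.1, t_end=50.0):
        # Continuous-time Markov chain for a single protein species:
        # production at constant rate k_on, degradation at rate k_off * count.
        t, n, trace = 0.0, 0, []
        while t < t_end:
            rates = [k_on, k_off * n]
            total = sum(rates)
            t += random.expovariate(total)            # time to the next reaction
            n += 1 if random.random() < rates[0] / total else -1
            trace.append((t, n))
        return trace

    print(gillespie_birth_death()[-1])   # final (time, copy-number) pair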

The work will involve both simulations and theory. B 2017-18C With the upcoming French elections and the mixed performance of polls in recent months in mind, the goal of this project will be to see to what extent fundamentals, such as economic indicators, demographic indicators, incumbents, etc., and the use of social media such as Twitter, Facebook, etc., can be used to predict election results. The project will involve a survey of past methods used, as well as data collection and model fitting.

This project is open-ended (and hence potentially risky); the main aim is to get good results using historical and current data. It would be helpful if the student has good programming experience as well as knowledge of various different machine learning techniques. The project will involve collaboration with Dr. Vincent Cohen-Addad (Copenhagen). B 2017-18C Steganography means hiding a hidden payload within an apparently-innocent cover, usually an item of digital media (in this project: images).

Steganalysis is the art of detecting that hiding took place.

A key question is how the amount of information that can be securely hidden (i.e. such that detectors have a high error rate) scales with the size of the cover. In 2008 I co-authored a paper showing that my theoretical "square root law" was observed experimentally, using state-of-the-art (for 2008) hiding and detection methods. This project is to run similar experiments using methods 10 years more modern.

It would involve combining off-the-shelf code (some in MATLAB, some in Python) from various researchers and running fairly large scale experiments to measure detection accuracy versus cover size and payload size in tens of thousands of images, then graphing the results suitably.
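
The experimental loop might be structured roughly as in the Python sketch below; embed and detect are hypothetical placeholders for the off-the-shelf embedding and steganalysis tools mentioned above, and the way errors are tallied is an assumption about the study design. Repeating this for a grid of cover sizes and payload sizes gives the data points to be graphed against the square root law.

    import random

    def embed(cover, payload_bits):
        return cover                      # placeholder: call an external embedder here

    def detect(image):
        return random.random() > 0.5      # placeholder: call a trained detector here

    def detector_error_rate(covers, payload_bits, trials=1000):
        # Average of false alarms on covers and missed detections on stego images.
        errors = 0
        for _ in range(trials):
            cover = random.choice(covers)
            stego = embed(cover, payload_bits)
            errors += int(detect(cover)) + int(not detect(stego))
        return errors / (2 * trials)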

Prerequisites: No particular prerequisites; the ability to piece together others' code and draw graphs nicely. B 2017-18C Steganography means hiding a hidden payload within an apparently-innocent cover, usually an item of digital media.

This project is to implement a transparent proxy which uses something like a webcam video stream to hide steganographic packets.

A local process should receive communication on a local socket and merge it with the webcam data stream, a process which can be reversed at the receiver. Prerequisites: It is necessary to have some experience working with video codecs, for example experience contributing to the H.264/5 codec, in order to place the payload in a webcam stream. It is definitely NOT possible to learn the format within the time available. Don't ask if it is possible to read up on it instead: it isn't.

B 2017-18C Steganography means hiding a hidden payload within an apparently-innocent cover, usually an item of digital media. This project is to implement a transparent proxy which uses something like a webcam video stream to hide steganographic packets. A local process should receive communication on a local socket and merge it with the webcam data stream, a process which can be reversed at the receiver. Prerequisites: It is necessary to understand something of video formats, for example experience with working on the H.264/5 codec, in order to place the payload in a webcam stream.

It is definitely NOT possible to learn the format within the time available. Don't ask if it is possible to read up on it instead: it isn't. Undergraduate students who wish to enquire about a project for 2017-18 are welcome to contact Prof Ker, but should note that the response may be delayed as he is on sabbatical. B 2017-18C Steganography means hiding a hidden payload within an apparently-innocent cover, usually an item of digital media. Most pure research focuses on bitmap or JPEG images, or simple video codecs.

In practice, there are often further constraints: the carrier might recompress or damage the object. First we must characterize the properties of image transmission through Twitter, and then provide an implementation of image steganography through it. This might be a complete piece of software with a nice interface, if the channel is straightforward, or a proof-of-concept if the channel causes difficult noise (in which case we will need suitable error-correcting codes). You will need a Twitter account and to learn the Twitter API without help from me. It would be useful to know a little about the JPEG image format before starting the project. B 2017-18C Steganography means hiding a hidden payload within an apparently-innocent cover, usually an item of digital media (in this project: images). Steganalysis is the art of detecting that hiding took place. The most effective ways to detect steganography are machine learning algorithms applied to "features" extracted from the images, trained on massive sets of known cover and stego objects.

The images are thus turned into points in high-dimensional space. We have little intuition as to the geometrical structure of the features (do images form a homogeneous cluster? do they scale naturally with image size? how much and in what ways do images from different cameras differ?), or how they are altered under embedding (do they move in broadly the same direction? is there a linear separator of cover from stego?). This is a programming project that creates a visualization tool for features, extracting them from images and then projecting them onto 2-dimensional space in interesting ways, while illustrating the effects of embedding. Prerequisites: Linear Algebra, Computer Graphics. B 2017-18C The goal of this project is to develop bots for the board game Hex. In a previous project, an interface was created to allow future students to pit their game-playing engines against each other. In this project the goal is to program a strong Hex engine. The student may choose the algorithms that underlie the engine, such as alpha-beta search, Monte-Carlo tree search, or neural networks. The available interface allows a comparison between different engines.

It is hoped that these comparisons will show that the students' engines become stronger over the years. The project involves reading about game-playing algorithms, selecting promising algorithms and data structures, and the design and development of software (in Java or Scala). It was recently shown (06848) that there is a 6x11 matrix M with rational nonnegative entries such that for any factorisation M = W H (where W is a 6x5 matrix with nonnegative entries, and H is a 5x11 matrix with nonnegative entries) some entries of W and H need to be irrational.

The goal of this project is to explore whether the number of columns of M can be chosen below 11, perhaps by dropping some columns of M.
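
A purely numerical warm-up, under the assumption that numpy is acceptable, might look like the Python sketch below: Lee-Seung multiplicative updates give an approximate rank-5 nonnegative factorisation whose residual hints at how hard a candidate matrix is to factor, though it can never certify that only irrational factorisations exist; that is what the exact geometric and SMT-based reasoning described below is for.

    import numpy as np

    def nmf(M, rank=5, iters=2000, eps=1e-9, seed=0):
        # Lee-Seung multiplicative updates for M ~= W @ H with W, H >= 0.
        m, n = M.shape
        rng = np.random.default_rng(seed)
        W, H = rng.random((m, rank)), rng.random((rank, n))
        for _ in range(iters):
            H *= (W.T @ M) / (W.T @ W @ H + eps)
            W *= (M @ H.T) / (W @ H @ H.T + eps)
        return W, H, np.linalg.norm(M - W @ H)   # residual of the approximation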

This will require some geometric reasoning. The project will likely involve the use of tools (such as SMT solvers) that check systems of nonlinear inequalities for solvability. Prerequisites: The project is mathematical in nature, and a good recollection of linear algebra is needed. Openness towards programming and the use of tools is helpful. B 2017-18C One of the fundamental problems at the interface of Algorithms and Economics is the problem of designing algorithms for simple trade problems that are incentive compatible.

An algorithm is incentive compatible when the participants have no reason to lie or deviate from the protocol. In general, we have a good understanding when the problem involves only buyers or only sellers, but poor understanding when the market has both buyers and sellers. This project will investigate and implement optimal algorithms for this problem. It will also consider algorithms that are natural and simple but may be suboptimal. Prerequisites: Mathematical and algorithmic maturity.

Fundamentals of game theory is useful, but not essential. B 2017-18C Bitcoin is a new digital currency that is based on distributed computation to maintain a consistent account of the ownership of coins. The main difficulty that the bitcoin protocol tries to address is to achieve agreement in a distributed system run by selfish participants. To address this difficulty, the bitcoin protocol relies on a proof-of-work scheme. Since the participants in this proof-of-work scheme want to maximize their own utility, they may have reasons to diverge from the prescribed protocol. The project will address some of the game-theoretic issues that arise from the bitcoin protocol. Prerequisites: Mathematical and algorithmic maturity.

Fundamentals of game theory is useful, but not essential. C The goal of deductive machine learning is to provide computers with the ability to automatically learn a behaviour that provably satisfies a given high-level specification. As opposed to techniques that generalise from incomplete specifications (e.g. examples), deductive machine learning starts with a complete problem description and develops a behaviour as a particular solution. Potential applications of deductive machine learning are detailed below, and a student would focus on one of these items for their project.

We envisage applying existing algorithms, with potential to develop new ones.
- Game playing strategy: given the specification of the winning criteria for a two-player game, learn a winning strategy.
- Program repair: given a buggy program according to a correctness specification, learn a repair that makes the program correct.
- Lock-free data structures: learn a data structure that guarantees the progress of at least one thread when executing multi-threaded procedures, thereby helping to avoid deadlock.
- Security exploit generation: learn code that takes advantage of a security vulnerability present in a given piece of software in order to cause unintended behaviour of that software.

- Compression: learn an encoding for some given data that uses fewer bits than the original representation. This can apply to both lossless and lossy compression.

B 2017-18C The Intel Galileo boards are microcontroller boards that can be networked, and also conform to the Arduino standard so that devices (sensors, actuators, screens, I/O devices) can be attached to them and controlled. The task is to build a device resembling a system-on-chip, i.e. a big system with several devices that work in concert together. An obvious example would be to build your own smartphone. It should provide the ability to initiate and receive calls, feature an MP3 player, and possibly some other phone features such as an alarm clock, GPS, and so on. You will be given a small budget to add the devices you need.

Part of the challenge is restricted battery power: how long can you run your phone off a largish capacitor? C Professor Marta Kwiatkowska is happy to supervise projects in the area of quantitative/probabilistic modelling, verification and synthesis, particularly those relating to the PRISM model checker. PRISM is an open source formal verification tool for analysis of probabilistic systems. PRISM has an extensive website which includes software for download, a tutorial, manual, publications and many case studies. Students' own proposals in the broad area of theory, algorithms and implementation techniques for software verification/synthesis will also be considered. Below are some concrete project proposals: Modelling trust in human-robot interaction.

When human users interact with autonomous robots, appropriate notions of computational trust are needed to ensure that their interactions are safe and effective: too little trust can lead to user disengagement, and too much trust may cause damage. Trust management systems have been introduced for autonomous agents on the Internet, but need to be adapted to the setting of mobile robots, taking into account intermittent connectivity and uncertainty in sensor readings. This project aims to develop an implementation of a simplified model checking algorithm. The project will suit a student interested in theory and/or software implementation.

This project is concerned with synthesising strategies for autonomous driving directly from requirements expressed in temporal logic, so that they are correct by construction. Probability is used to quantify information about hazards, such as accident hotspots. Inspired by the DARPA Urban Challenge, a method for synthesising strategies (controllers) from multi-objective requirements was developed and validated on map data for villages in Oxfordshire ().

The idea is to develop the techniques further, to allow high-level navigation based on waypoints, and to develop strategies for avoiding threats, such as road blockage, at runtime.

In the longer term, the goal is to validate the methods on realistic scenarios in collaboration with the Mobile Robotics Group. The project will suit a student interested in theory or software implementation. For more information about the project see / Controller synthesis for robot coordination. Autonomous robots have numerous applications in scenarios such as warehouse management, planetary exploration, or search and rescue. In view of environmental uncertainty, such scenarios are modelled using Markov decision processes.

Goals (e.g. the goods should be delivered to location A while avoiding the hazardous location B) can be conveniently specified using temporal logic, from which correct-by-construction controllers (strategies) can be generated. This project aims to develop a PRISM model of a system of robots for a particular scenario so that the safety and effectiveness of their cooperation is guaranteed. Techniques based on machine learning, and specifically real-time dynamic programming (), will be utilised to generate controllers directly from temporal logic goals.

This project will suit a student interested in machine learning and software implementation. Modelling and verification of DNA programs. DNA molecules can be used to perform complex logical computation. DNA computation differs from conventional digital computation and is sometimes referred to as ‘computing with soup’. Correct design of DNA devices is error-prone and the task is supported by tools such as the DNA Strand Displacement (DSD) programming language and simulator ( /en-us/projects/dna/ ) developed at Microsoft.

The DSD tool enables the probabilistic model checking of DNA circuits and has been used to identify a flaw in a DNA transducer gate (see / ?key=LPC+12). The aim of this project is to model and analyse the DNA implementation of logic inference proposed in “Autonomous Resolution Based on DNA Strand Displacement”, DNA Computing and Molecular Programming, LNCS 6937, 2011. The project will suit a student interested in modelling DNA programs using DSD and/or PRISM. For more information about the DNA computing project see / . A Molecular Recorder (with Luca Cardelli).

Recent technological developments allow massive parallel reading (sequencing) and writing (synthesis) of heterogeneous pools of DNA strands. We are no longer limited to simple circuits built out of a small number of different strands, nor to reading out a few bits of output by fluorescence. While these C Professor Marta Kwiatkowska is happy to supervise projects in the area of safety assurance and automated verification for deep learning. Below are some concrete project proposals: Safety Testing of Deep Neural Networks.

Despite the improved accuracy of deep neural networks, the discovery of adversarial examples has raised serious safety concerns. In recent work (07859) a method was proposed for searching for adversarial examples using the SIFT feature extraction algorithm. The method is based on a two-player turn-based stochastic game, where the first player's objective is to find an adversarial example by manipulating the features, and proceeds through Monte Carlo tree search. It was evaluated on various networks, including YOLO object recognition from camera images.

This project aims to adapt the techniques to object detection in lidar images such as Vote3D ( /efficient-object-detection-from-3d-point-clouds/), utilising the Oxford Robotics Institute dataset ( /datasets/). Safety Testing of End-to-end Neural Network Controllers. The PilotNet end-to-end controller inputs camera images and produces a steering angle. The network is trained on data from cars being driven by real drivers, but it is also possible to use the Udacity simulator to train it.

Safety concerns have been raised for neural network controllers because of their vulnerability to adversarial examples: an incorrect steering angle may force the car off the road. In recent work (07859) a method was proposed for searching for adversarial examples using the SIFT feature extraction algorithm. The method is based on a two-player turn-based stochastic game, where the first player's objective is to find an adversarial example by manipulating the features, and proceeds through Monte Carlo tree search. This project aims to use these techniques to evaluate the robustness of PilotNet to adversarial examples.

Universal L0 Perturbations for Deep Neural Networks (with Wenjie Ruan). Since deep neural networks (DNNs) are deployed in autonomous driving systems, ensuring their safety, security and robustness is essential. Small input perturbations that cause misclassification can be generated by adversarial attackers. Most current adversarial perturbations are designed based on the L1, L2 or Linf norms.

Recent work demonstrated the advantages of perturbations based on the L0 norm ( /papers/2017 sp ). This project aims, given a well-trained deep neural network, to demonstrate the existence of a universal (image-agnostic) L0-norm perturbation that causes most images to be misclassified. The work will involve designing a systematic algorithm for computing universal perturbations and empirically analysing these to show whether they generalize well across neural networks. This project is suited to students familiar with neural networks and Python programming.
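
The harness below is a hypothetical Python sketch of how one might measure whether a single L0-limited perturbation fools a classifier across many images; predict is a placeholder for the trained DNN, and a real algorithm would search for the modified positions and values rather than sampling them at random.

    import numpy as np

    def random_l0_perturbation(shape, k, rng):
        # An L0 "budget" of k: choose k pixel positions and replacement values.
        idx = rng.choice(int(np.prod(shape)), size=k, replace=False)
        vals = rng.random(k)
        return idx, vals

    def apply_perturbation(image, idx, vals):
        flat = image.reshape(-1).copy()
        flat[idx] = vals
        return flat.reshape(image.shape)

    def universal_success_rate(images, predict, idx, vals):
        # Fraction of images whose predicted label changes under the same perturbation.
        flips = sum(predict(apply_perturbation(x, idx, vals)) != predict(x) for x in images)
        return flips / len(images)

    rng = np.random.default_rng(0)
    idx, vals = random_l0_perturbation((28, 28), k=10, rng=rng)   # example budget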

B 2017-18C Networks often contain multiple vulnerabilities and weaknesses which can be exploited by attackers.

Security analysts often find it necessary to perform network scanning, probing and vulnerability testing, which aids the process of discovering and correcting network vulnerabilities. There exist a number of challenges in terms of gathering network data in this manner:
- Scalability. Scanning/probing techniques do not scale well to large networks (Lippmann et al.). A number of approaches, such as that by Roschke et al. (2011), attempted to combine and correlate the results returned from multiple vulnerability databases. However, the databases return the results in different formats, some textual and some XML, which means that the results have to be unified into a common meaningful format.
- Data Consolidation. Other than Roschke et al.'s (2009) study into combining network scanning methods, there have been very few studies which critically evaluate methods of combining the use and results of multiple scanning tools.
- Performance. Scanning and then combining the results from multiple scanners takes a lot of time (Cheng et al.). Quite often the network configuration changes during the scan, which means that the results are often inaccurate.
This thesis addresses one or more of the challenges presented above.

Prerequisites: This project involves practical work which may involve setting up a virtual network and applying a range of scanning, probing and vulnerability testing mechanisms on the network. B 2017-18C The calculation of network security metrics is a complex problem which involves understanding the state and configuration of network connections, devices and protocols. This research analyses the problem of calculating network security metrics and proposes a framework which can calculate security metrics for a typical small network comprising numerous devices, operating systems and hosts. The project will involve the configuration of virtual networks for testing the framework. CVSS or other metric scores may be used to aid the calculation of the metric.
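
As an assumed starting point only, a per-host aggregation of CVSS base scores might look like the short Python sketch below; the choice of worst-case and mean aggregation is illustrative, and the project would need to justify and refine whatever metric it adopts.

    def host_risk_scores(vulns_by_host):
        # vulns_by_host maps a host name to the CVSS base scores of its known
        # vulnerabilities; combine them into simple worst-case and average metrics.
        return {host: {"max": max(scores), "mean": sum(scores) / len(scores)}
                for host, scores in vulns_by_host.items() if scores}

    print(host_risk_scores({"web01": [7.5, 5.3], "db01": [9.8]}))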

This dissertation involves some practical work which may involve setting up a virtual network with a number of machines and applying a range of scanning, probing and vulnerability testing mechanisms on the network. C Description: Blood pressure is a widely used biomarker to stratify and diagnose several cardiovascular diseases. Recent advances in imaging and computational techniques now enable the non-invasive assessment of blood pressure gradients in our central circulatory system (heart chambers and main vessels), and have the potential to improve our capability to understand disease processes of our cardiovascular system. This project will contribute to the development of these novel technologies, and to the analysis of the distribution of pressure fields in the human aorta. The student will engage with a project at its early stage, having the opportunity to feel the thrill of exploring novel clinical data never analysed before.

The student will work in a multidisciplinary team with cardiologists from the John Radcliffe Hospital in the search for explanations and justification of the findings. Experience with finite element methods (Navier-Stokes) and image analysis is an advantage. B 2017-18C Constraint programming is a programming paradigm in which programmers write programs by specifying the problem description (in terms of constraints) and letting computers use their computational power to figure out the solution automatically.

This is in contrast to typical programming paradigms, wherein programmers explicitly spell out procedures for carrying out the computation. Constraint programming has a wide range of applications (e.g. optimisation, planning) and relies on powerful constraint solvers to solve computational problems. SAT-solvers --- claimed by some researchers to be among the greatest achievements of the past decade --- are among the most powerful solvers in the constraint programming toolbox.

In a nutshell, SAT-solvers are algorithms for solving satisfiability of boolean formulas, which is an NP-complete problem that can be used as a universal language for encoding many practical combinatorial problems. Although the problem of satisfiability of boolean formulas is difficult in theory, SAT-solvers have greatly advanced in the past two decades to the extent that large formulas (with millions of variables/clauses) can now be handled.
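
For readers new to the area, the sketch below is a toy DPLL-style solver in Python working on clauses in the usual integer-literal encoding; real solvers add conflict-driven clause learning, watched literals and restarts, and the example formula is just an assumed illustration.

    def dpll(clauses, assignment=None):
        # Clauses are lists of non-zero integers; -x denotes the negation of variable x.
        assignment = assignment or {}
        clauses = [c for c in clauses
                   if not any(assignment.get(abs(l)) == (l > 0) for l in c)]
        if not clauses:
            return assignment                      # every clause is satisfied
        if any(all(abs(l) in assignment for l in c) for c in clauses):
            return None                            # some clause is falsified
        lit = next(l for c in clauses for l in c if abs(l) not in assignment)
        for value in (True, False):
            result = dpll(clauses, {**assignment, abs(lit): value})
            if result is not None:
                return result
        return None

    print(dpll([[1, 2], [-1, 2], [-2, 3]]))   # e.g. {1: True, 2: True, 3: True}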

The aim of the project is to investigate ways in which to improve the performance of SAT-solvers by embedding symmetry information. Boolean formulas that arise in practice (e.g. as encodings of combinatorial problems) exhibit many symmetries, which the programmers typically know.

Specific questions to explore include: (1) what is a convenient (but general) way to specify symmetry information in boolean formulas? (2) given the symmetry information, how do we engineer SAT-solvers to exploit such information to speed up the search for solutions? C Concurrent programs on modern architectures can often behave in surprising ways: the presence of caches means that writes can take time to propagate from one thread to another; the compiler can perform optimisations that cause operations to be re-ordered; similarly, the hardware may execute operations out of order. The Java Memory Model (JMM) gives a formal definition of the behaviours that are allowed for such programs. Unfortunately, the definition is convoluted and hard to understand. The aim of this project is to aid better understanding of the JMM, by producing a tool that, given a small program (like the one above or the ones in [4]), returns all its valid executions.

References: [1] Java Memory Model Pragmatics (transcript), Aleksey Shipilev, /blog/2014/jmm-pragmatics.

[2] Formalising Java's Data Race Free Guarantee, David Aspinall and Jaroslav Sevcik.

[3] Jeremy Manson, William Pugh and Sarita Adve, The Java Memory Model. [4] William Pugh and Jeremy Manson, Java memory model causality test cases (2004), /~pugh/java/memoryModel/ . B 2017-18C Several projects are available to implement new algorithms or protocols in the MOSAICS ( /mosaics) software package. The first project would aim for the development of new analysis tools to interpret the simulation results. For example, the new protocol would take as input a structural similarity measure and a trajectory of simulated conformations, and would produce as output a measure of the structural diversity of the conformations visited.

In the second project, students would implement and compare different physical models to describe hydrogen bonding, which is among the most important canonical interactions that stabilize double helical DNA. Prerequisites: Geometric Modelling and Numerical Methods (e.g. solution of Ordinary Differential Equations) and their applications to atomistic simulations. B 2017-18C Chemical modifications such as (hydroxy)methylation on nucleic acids are used by the cell for the silencing and activation of genes.

These so-called epigenetic marks can be recognized by ‘protein readers’ indirectly due to their structural ‘imprints’, the effects they impose on DNA structure. The project includes the development of computational protocols to assess the effect of epigenetic modifications on DNA structure. This research may shed light on how different epigenetic modifications affect the helical parameters of double stranded DNA. Prerequisites: Strong interest in visualizing, analysing and comparing 3D objects and modelling molecular structures. The project can be tailored to suit those from a variety of backgrounds but would benefit from having taken the following courses: Computer Graphics. B 2017-18C Commercial use of the Internet is becoming more and more common, with an increasing variety of goods becoming available for purchase over the Net.

Clearly, we want such purchases to be carried out securely: a customer wants to be sure of what (s)he's buying and the price (s)he's paying; the merchant wants to be sure of receiving payment; both sides want to end up with evidence of the transaction, in case the other side denies it took place; and the act of purchase should not leak secrets, such as credit card details, to an eavesdropper. The aim of this project is to find out more about the protocols that are used for electronic commerce, and to implement a simple e-commerce protocol. In more detail: understand the requirements of e-commerce protocols; specify an e-commerce protocol, both in terms of its functional and security requirements; understand cryptographic techniques; understand how these cryptographic techniques can be combined to create a secure protocol, and understand the weaknesses that allow some protocols to be attacked; design a protocol to meet the requirements identified; implement the protocol. A variant of this project would be to implement a protocol for voting on the web (which would have a different set of security properties). Prerequisites for this project include good program design and implementation skills, including some experience of object-oriented programming, and a willingness to learn about protocols and cryptography.
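
As a toy first step only, the Python sketch below authenticates an order with an HMAC so that tampering in transit is detected; this is far from a full e-commerce protocol (it provides no non-repudiation, freshness or confidentiality), and the key, message format and field names are assumptions made for illustration.

    import hashlib, hmac, json

    def tag_order(shared_key, order):
        # Compute a message authentication code over a canonical encoding of the order.
        msg = json.dumps(order, sort_keys=True).encode()
        return hmac.new(shared_key, msg, hashlib.sha256).hexdigest()

    def verify_order(shared_key, order, tag):
        return hmac.compare_digest(tag_order(shared_key, order), tag)

    key = b"pre-shared demo key"
    order = {"item": "book", "price": 12.99, "nonce": 42}
    tag = tag_order(key, order)
    print(verify_order(key, order, tag))                       # True
    print(verify_order(key, {**order, "price": 0.99}, tag))    # False: tampering detected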

The courses on concurrency and distributed systems provide useful background for this project. Reference: Jonathan Knudsen, Java Cryptography, O'Reilly, 1998. C "Apache Spark is today one of the most popular frameworks for large-scale data analysis. Spark offers a functional (Scala-like) API for processing data collections that are distributed over a cluster of machines. Its declarative approach, domain-specific libraries (e.g. for machine learning and graph processing), and high performance have enabled its wide adoption in industry. Although Spark can transform collections of arbitrary types, it can exhibit severe performance problems when processing nested data formats such as JSON and XML. In particular, distributed processing of datasets where nested collections have skewed cardinalities (e.g. one extremely large nested collection, with the others small) leads to an uneven distribution of work among the machines. In such cases, developers typically have to undergo a painful process of manual query re-writing to avoid load imbalance for large inner collections in their workloads. This project aims to extend the Spark API with new functionality that would automatically transform user queries to avoid data skews. This project is a great opportunity for students to understand how Apache Spark works under the hood and to contribute to an open-source project."

B 2017-18C Model checking has emerged as a powerful method for the formal verification of programs.

Temporal logics such as CTL (computation tree logic) and CTL* are widely used to specify programs because they are expressive and easy to understand. Given an abstract model of a program, a model checker (which typically implements the acceptance problem for a class of automata) verifies whether the model meets a given specification. A conceptually attractive method for solving the model checking problem is by reducing it to the solution of (a suitable subclass of) parity games. These are a type of two-player infinite game played on a finite graph. The project concerns the connections between the temporal logics CTL and/or CTL*, automata, and games.
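
To ground the model checking side, here is a minimal explicit-state Python sketch of two CTL operators (EX and E[ U ]) computed as set operations and a least fixpoint over a toy Kripke structure; it is an assumed warm-up, not the automata- or game-based approach the project would study.

    def sat_EX(kripke, phi_states):
        # States with at least one successor satisfying phi.
        return {s for s, succs in kripke.items() if succs & phi_states}

    def sat_EU(kripke, phi_states, psi_states):
        # Least fixpoint for E[ phi U psi ]: psi, or phi with a successor already in the set.
        result = set(psi_states)
        while True:
            new = result | {s for s in phi_states if kripke[s] & result}
            if new == result:
                return result
            result = new

    # Kripke structure given as successor sets, with a toy labelling p and q.
    kripke = {0: {1}, 1: {2}, 2: {2}}
    p, q = {0, 1}, {2}
    print(sat_EU(kripke, p, q))   # {0, 1, 2}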

Some of the following directions may be explored:
1. Representing CTL / CTL* as classes of alternating tree automata.
2. Inter-translation between CTL / CTL* and classes of alternating tree automata.
3. Using Büchi games and other subclasses of parity games to analyse the CTL / CTL* model checking problem.
4. Efficient implementation of model checking algorithms.
5. Application of the model checker to higher-order model checking.
Reference: Vardi, Pierre Wolper: An automata-theoretic approach to branching-time model checking.

B 2017-18C There is an enormous amount of information on constructing various sorts of mathematical objects that are "interesting" in one way or another, e.g. block designs, linear and non-linear codes, Hadamard matrices, elliptic curves, etc.

There is considerable interest in having this information available in computer-ready form. However, usually the only available form is a paper describing the construction, while no computer code (and often no detailed description of a possible implementation) is provided. This provides interesting algorithmic and software engineering challenges in creating verifiable implementations; properly structured and documented code, supplemented by unit tests, has to be provided, preferably in functional programming style (although performance is important too). The project will contribute to such implementations. The inspired sinewave device works by delivering small doses of tracer gases into the patient's breaths and measuring the responses in the expired breaths. The device is being developed towards commercialisation. A lung simulation that could be used by non-computer-scientists (such as nurses and medical doctors) would be a useful addition to the technology. Objectives: to develop a user-friendly lung simulation to help predict the responses of the inspired sinewave tests in various lung conditions, from healthy to diseased. The simulation consists of two key parts: the GUI and the lung model.

The GUI: (i) the GUI would allow users to enter different inspired sinewave test settings and change parameters of the lung model; (ii) the output of the simulation would be a visualisation of the simulated test and also an Excel file of simulated data for back-testing of the device algorithms (already developed). The lung model: the mathematical equations for the model have already been established and will need to be implemented in either Matlab or Simulink. Details of the mathematical lung model are available upon request. Prerequisites: The student should: be competent in Matlab (Simulink would be a bonus);

be competent in implementing differential equations; and have an interest in computational biology and mathematical modelling. B 2017-18C The aim of this project is to build an educational tool which enables the progress of a Bayesian parameter estimation algorithm to be visualised. The model to be fitted might be (but is not limited to) a system of Ordinary Differential Equations, and the Bayesian estimation tools might be built around an existing system such as Stan, PyML or Edward. A good tutorial system should be able to let the user change the underlying model system, introduce noise to a system, visualise interactive updates to probability distributions, explore the progress of a chosen sampling method such as Metropolis-Hastings, and provide enough information that a novice student can get an intuition into all aspects of the process.
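
The sampler such a tool would animate can be very small; below is a minimal random-walk Metropolis-Hastings sketch in Python (the Gaussian target and the step size are assumptions chosen for illustration), whose chain of samples is exactly what the visualisation would replay step by step.

    import math, random

    def metropolis_hastings(log_post, x0, steps=5000, step_size=0.5):
        # Random-walk Metropolis: propose x' ~ Normal(x, step_size) and accept
        # with probability min(1, post(x') / post(x)).
        x, lp = x0, log_post(x0)
        samples = []
        for _ in range(steps):
            prop = x + random.gauss(0.0, step_size)
            lp_prop = log_post(prop)
            if math.log(random.random()) < lp_prop - lp:
                x, lp = prop, lp_prop
            samples.append(x)
        return samples

    chain = metropolis_hastings(lambda x: -0.5 * x * x, 0.0)   # standard normal target
    print(sum(chain) / len(chain))                             # should be close to 0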

B 2017-18C This project involves running cardiac cell models on a high-end GPU card. Each model simulates the electrophysiology of a single heart cell and can be subjected to a series of computational experiments (such as being paced at particular heart rates). The goal of the project is to add functionality to the compiler in order to get OpenCL or CUDA implementations of the same cell models and thus to increase the efficiency of the "Web Lab". B 2017-18C I am interested in supervising general projects in the area of computer graphics.

If you have a particular area of graphics-related research that you are keen to explore then we can tailor a bespoke project for you. Specific projects I have supervised in the past include "natural tree generation", which involved using Lindenmayer systems to grow realistic looking bushes and trees to be rendered in a scene; "procedural landscape generation", in which an island world could be generated on-the-fly using a set of simple rules as a user explored it; "gesture recognition", where a human could control a simple interface using hand-gestures; "parallel ray-tracing" on distributed-memory clusters and using multiple threads on a GPU card; "radiosity modelling", used for analysing the distribution of RFID radio signal inside a building; and "non-photorealistic rendering", where various models were rendered with toon/cel shaders and a set of pencil-sketch shaders. B 2017-18C I am interested in novel visualisation as a way to represent things in a more appealing and intuitive way. For example, the Gnome disk usage analyzer (Baobab) uses either a "ring chart" or a "treemap chart" representation to show us which sub-folders are using the most disk space. In the early 1990s the IRIX file system navigator used a 3D skyscraper representation to show us similar information.

There are plenty more ways of representing disk usage, from DAGs to centralised Voronoi diagrams. What kind of representation is most intuitive for finding a file which is hogging disk space, and which is most intuitive for helping us to remember where something is located in the file-system tree? The aim is to explore other places where visualisation can give intuition: for example, to visualise the output of a profiler to find bottlenecks in software, to visualise the output of a code coverage tool in order to check that test-suites are testing the appropriate functionality, or even to visualise the prevalence of diabetes and heart disease in various regions of the country. C Incremental dependency parsers such as those described in /doi/pdf/10.07-056-R1-07-027 typically try to predict what parsing action to take next by training a classifier which will look ahead in the input, and at the current parse state, and make a choice between actions.

This project aims to experiment to see whether a reinforcement learning decision component could lead to better parsing performance than the more usual classifier-based decisions. Aboria can be used as a high performance library to implement numerical methods such as Molecular Dynamics in computational chemistry, or Gaussian Processes for machine learning. Project 1: Aboria features a radial neighbour search to find nearby particles in n-dimensional space, in order to calculate their interactions.

This project will implement a new algorithm based on calculating the interactions between neighbouring *clusters* of particles. Its performance will be compared against the existing implementation, and across the different spatial data structures used by Aboria (Cell-list, Octree, Kdtree). Prerequisites: C++. Project 2: Aboria features a serial Fast Multipole Method (FMM) for evaluating smooth long range interactions between particles. This project will implement and profile a parallel FMM algorithm using CUDA and/or the Thrust library.

Prerequisites: C++, knowledge of GPU programming using CUDA and/or Thrust. Project 3: The main bottleneck of the FMM is the interactions between well-separated particle clusters, which can be described as low-rank matrix operations.

This project will explore different methods of compressing these matrices in order to improve performance, using either Singular Value Decomposition (SVD), "Randomised" SVD, or Adaptive Cross Approximation. Prerequisites: C++, Linear Algebra. B 2017-18C Description: A Boolean program is one where all variables are of Boolean type. In the context of formal verification of concurrent software, concurrent Boolean programs (CBPs) are generated by a process of abstracting the original program, in such a way that the original analysis problem now reduces to an equivalent problem on the CBP. The unfolding technique is a well studied verification method for another model of concurrency called Petri nets. Unfoldings represent the behaviour of a Petri net in a compact way, by means of a partial order enriched with additional information. Not only is this representation theoretically neat, it is also very efficient for practical verification.

The goal of the project is to apply the unfolding technique to the verification of CBPs, and to compare it with existing verification techniques for CBPs. Prerequisites: suitable for students having followed the course "Computer-Aided Formal Verification". B 2017-18C Field-programmable gate arrays (FPGAs) are integrated circuits that can be configured after manufacture into almost any conceivable logic circuit combining both combinatorial and synchronous elements. They can be used to explore real hardware implementations of processor designs, from simple accumulator machines, through to register machines, and more unusual stack machines. Their use outside industry has previously been limited due to the high cost and complexity of their associated development software. However, this is changing, with an open-source toolchain for popular FPGA devices becoming available in the last year (the equivalent of Linux in the OS world).

This project will build on this toolchain to develop the additional software necessary to allow students to design and explore simple processor designs on a custom FPGA development board. The current toolchain requires the use of the Verilog hardware description language. Verilog is very powerful but also very general. This project will develop a high-level language (possibly a graphical language) focused on the development of simple processor designs from a small number of standard components (such as RAM, multiplexers and registers). Prerequisites: Digital Systems, Compilers & Computer Architecture useful but not essential.

B 2017-18C Convolutional neural networks have made dramatic advances in recent years on many image and vision processing tasks. While training such networks is computationally expensive (typically requiring very large image datasets and exploiting GPU acceleration), they can often be deployed on much simpler hardware if simplifications such as integer or even binary weights are imposed on the network. This project will explore the deployment of trained convolutional networks on microcontrollers (and possibly also FPGA-based hardware) with the intention of demonstrating useful image processing (perhaps recognising the presence of a face in the field of view of a low-resolution camera) on low-power devices. Prerequisites: Machine Learning & Computer Architecture useful but not essential.

B 2017-18C While the architecture of current reduced instruction set processors is well established, and relatively static, the early days of computing saw extensive experimentation and exploration of alternative designs.

Commercial processors developed during the 1960s, 1970s and 1980s included stack machines, LISP machines and massively parallel machines, such as the Connection Machine (CM-1), consisting of 65,536 individual one-bit processors connected together as a 12-dimensional hypercube. This period also saw the development of the first single-chip microprocessors, such as the Intel 4004, and the first personal computers, such as the Altair 8800 using the Intel 8080 microprocessor. This project will attempt to resurrect one of these extinct designs (or a scaled-down version if necessary) using a modern low-cost field-programmable gate array (FPGA). You will be required to research the chosen processor, using both original and modern sources, and then use Verilog to develop a register-level description of the device that can be implemented on an FPGA. The final device should be able to run the software of the original and could be realised in a number of different forms depending on the chosen processor (e.g. an Altair 8800 on a small USB stick running Microsoft BASIC).

Prerequisites: Digital Systems or Computer Architecture useful but not essential

B 2017-18C Being able to define the units of constants and variables in a programming language has great value in many applications. NASA's Mars Climate Orbiter was lost in 1999 due to software that calculated trajectory thruster firings in pound-seconds rather than newton-seconds. In other cases, dimensional analysis (statically checking that the computed units match those that are expected) is sufficient to catch many errors in calculations.

While F# supports units natively, and libraries exist in many other languages (e.g. Java, Haskell, Python), none are particularly easy to use, and they often introduce clumsy syntax. This project will build on and improve these approaches. You will be required to develop a unit-aware interactive programming environment enabling unit-safe, physics-based calculations to be performed.
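
As a purely illustrative sketch of the underlying idea (the Quantity class and its three-exponent dimension tuples are invented for this example, and run-time checking is only one possible design; the project could equally pursue static checking):

    # A quantity carries exponents of the SI base units (metre, kilogram, second);
    # arithmetic checks that the dimensions of the operands are consistent.
    class Quantity:
        def __init__(self, value, dims):
            self.value, self.dims = value, dims   # dims = (m, kg, s) exponents

        def __add__(self, other):
            if self.dims != other.dims:
                raise TypeError(f"cannot add dimensions {self.dims} and {other.dims}")
            return Quantity(self.value + other.value, self.dims)

        def __mul__(self, other):
            return Quantity(self.value * other.value,
                            tuple(a + b for a, b in zip(self.dims, other.dims)))

        def __repr__(self):
            return f"Quantity({self.value}, dims={self.dims})"

    impulse = Quantity(4.4, (1, 1, -1))   # newton-seconds: kg * m / s
    force = Quantity(1.0, (1, 1, -2))     # newtons: kg * m / s^2
    print(impulse + Quantity(2.0, (1, 1, -1)))   # fine: same dimensions
    try:
        impulse + force                   # dimensional error caught at run time
    except TypeError as err:
        print("caught:", err)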

This might be a stand-alone solution, or a kernel for an existing interactive computing environment such as Project Jupyter. Prerequisites: Compilers useful but not essential

B 2017-18C Timed CSP reinterprets the CSP language in a real-time setting and has a semantics in which the exact times of events are recorded as well as their order. Originally devised in the 1980s, it has only just been implemented as an alternative mode for FDR. The objective of this project is to take one or more examples of timed concurrent systems from the literature, implement them in Timed CSP, and where possible compare the performance of these models with similar examples running on other analysis tools such as Uppaal. References: Understanding Concurrent Systems, especially Chapter 15, and Model Checking Timed CSP, from AWR's web list of publications.

B 2017-18C The existing JIT for Keiko is very simple-minded, and does little more than translate each bytecode into the corresponding machine code.

Either improve the translation by using one of the many JIT libraries now available, or adjust the Oberon compiler and the specification of the bytecode machine to free it of restrictive assumptions and produce a better pure-JIT implementation. B This project is not available to MSc students in 2016-17. Undergraduate students who wish to enquire about a project for 2017-18 are welcome to contact Prof Spivey but should note that the response may be delayed as he is on sabbatical.

But there is still a gap between the performance of GeomLab programs and similar ones written in Java or C, and more ambitious image-processing tasks would be made possible by better performance, particularly in the area of arithmetic. Explore ways of improving performance, perhaps including the possibility of allowing numbers to be passed around without wrapping them in heap-allocated objects, or of compiling the code for Haskell-style pattern matching in a better way. This project is not available to MSc students in 2016-17.

Undergraduate students who wish to enquire about a project for 2017-18 are welcome to contact Prof Spivey but should note that the response may be delayed as he is on sabbatical.

B 2017-18C The Oberon compiler inserts code into every array access and every pointer dereference to check for runtime errors, like a subscript that is out of bounds or a pointer that is null. In many cases, it is possible to eliminate these checks because it can be determined from the program that no error can occur. For example, an array access inside a FOR loop may be safe given the bounds of the loop, and several uses of the same pointer in successive statements may be able to share one check that the pointer is non-null. Modify the Oberon compiler (or a simpler one taken from the Compilers labs) so that it represents the checks explicitly in its IR, and introduce a pass that removes unnecessary checks, so speeding up the code without compromising safety.
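
As a toy illustration of the kind of pass involved (the straight-line IR below is hypothetical and much simpler than Keiko's or the Oberon compiler's actual representation), the sketch removes null checks that are already covered by an earlier check of the same variable on the same path:

    # Toy straight-line IR: "nullcheck x" traps if x is null; a later nullcheck of
    # the same variable is redundant unless x may have been reassigned in between.
    def remove_redundant_nullchecks(block):
        checked = set()
        out = []
        for op, *args in block:
            if op == "nullcheck":
                if args[0] in checked:
                    continue              # already proven non-null on this path
                checked.add(args[0])
            elif op == "assign":
                checked.discard(args[0])  # the variable may point elsewhere now
            out.append((op, *args))
        return out

    block = [("nullcheck", "p"), ("load", "a", "p", 0),
             ("nullcheck", "p"), ("load", "b", "p", 1),
             ("assign", "p"), ("nullcheck", "p"), ("load", "c", "p", 0)]
    for instr in remove_redundant_nullchecks(block):
        print(instr)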

This project is not available to MSc students in 2016-17. Undergraduate students who wish to enquire about a project for 2017-18 are welcome to contact Prof Spivey but should note that the response may be delayed as he is on sabbatical.

B 2017-18C At present, Keiko supports only conventional Pascal-like language implementations that store activation records on a stack.

Experiment with an implementation where activation records are heap-allocated (and therefore recovered by a garbage collector), procedures are genuinely first-class citizens that can be returned as results in addition to being passed as arguments, and tail recursion is optimised seamlessly. This project is not available to MSc students in 2016-17. Undergraduate students who wish to enquire about a project for 2017-18 are welcome to contact Prof Spivey but should note that the response may be delayed as he is on sabbatical.

B 2017-18C Alternative firmware for the Mindstorms robot controller provides an implementation of the JVM, allowing Java programs to run on the controller, subject to some restrictions. Using this firmware as a guide, produce an interpreter for a suitable bytecode, perhaps some variant of Keiko, allowing Oberon or another robot language of your own design to run on the controller. Aim to support the buttons and display at first, and perhaps add control of the motors and sensors later. This project is not available to MSc students in 2016-17. Undergraduate students who wish to enquire about a project for 2017-18 are welcome to contact Prof Spivey but should note that the response may be delayed as he is on sabbatical.

B 2017-18C The GeomLab language is untyped, leading to errors when expressions are evaluated that would be better caught at an earlier stage. Most GeomLab programs, however, follow relatively simple typing rules. The aim in this project is to write a polymorphic type checker for GeomLab and integrate it into the GeomLab system, which is implemented in Java.


A simple implementation of the type checker would wait until an expression is about to be evaluated, and type-check the whole program at that point. B This project is not available to MSc students in 2016-17. Undergraduate students who wish to enquire about a project for 2017-18 are welcome to contact Prof Spivey but should note that the response may be delayed as he is on sabbatical.
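
For a flavour of the core machinery such a checker needs (a generic sketch in Python rather than GeomLab's Java implementation; the type representation is invented, and the occurs check and let-polymorphism are omitted for brevity), here is a minimal unification routine over type terms:

    # Types are either type variables (strings starting with a quote) or pairs
    # (constructor_name, [argument_types]). Unification finds a substitution
    # that makes two types equal, or fails.
    def resolve(t, subst):
        while isinstance(t, str) and t in subst:
            t = subst[t]
        return t

    def unify(t1, t2, subst):
        t1, t2 = resolve(t1, subst), resolve(t2, subst)
        if t1 == t2:
            return subst
        if isinstance(t1, str):           # t1 is an unbound type variable
            subst[t1] = t2
            return subst
        if isinstance(t2, str):           # t2 is an unbound type variable
            subst[t2] = t1
            return subst
        (name1, args1), (name2, args2) = t1, t2
        if name1 != name2 or len(args1) != len(args2):
            raise TypeError(f"cannot unify {t1} with {t2}")
        for a, b in zip(args1, args2):
            unify(a, b, subst)
        return subst

    int_t, bool_t = ("int", []), ("bool", [])
    # Unify 'a -> int with bool -> 'b: forces 'a = bool and 'b = int.
    print(unify(("->", ["'a", int_t]), ("->", [bool_t, "'b"]), {}))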


C More students are enrolling in college and professional degree programs than ever before. However, current degree programs are often "one size fits all"; such programs ignore the heterogeneity of students in terms of backgrounds, abilities, learning styles and career goals. Moreover, because of ever-increasing student/teacher ratios, students are often left struggling to find their own pathways through degree programs. The combination leads to poor learning outcomes, low engagement, dissatisfaction and high dropout rates. In this project, an interactive electronic system will be built that is personalized for each student, is able to continuously track progress and goals, capitalize on the knowledge accumulated, and recommend suitable courses and activities in order to build skills, enhance interest and promote long-term goals.


In effect, our personalized interactive system operates as if there were a dedicated mentor for each student. To build this system, the following modules will need to be developed: (1) student and course similarity discovery methods; (2) student performance prediction algorithms; (3) personalized course recommendation algorithms. To read more about the role of machine learning in education, see /EduAdvance. Prerequisites: This project is suitable for someone with at least basic knowledge of machine learning.

B 2017-18C A large international community of researchers is trying to use computers to allow groups of people (or groups of automated agents) to make better joint decisions. This is a hard problem, since the preferences that agents report might contradict each other, which leads to so-called voting paradoxes. Also, it can be computationally hard to calculate what decision to make.

A promising way to tackle this problem is by exploiting structure in the reported preferences. For this purpose, Australian researchers have collected an impressive amount of real-world preference data (PrefLib), comprising over 3,000 data sets drawn from sources as diverse as rating systems at Netflix and TripAdvisor and real political election data, with the aim of figuring out how we might use properties of preferences that actually occur in the real world. This project is about analysing this data to reveal how much structure is contained in these preferences. Technically, we are interested in figuring out how close the reported preferences are to what are known as single-peaked and/or single-crossing preferences. There are different measures of closeness, and for many of them the associated decision problem is NP-hard; for others, the computational complexity is not known. Prerequisites: There are several ways in which this project can be pursued. On the one hand, one can consider the notions of closeness for which the complexity of the associated problem is not known, and try to develop an efficient algorithm or prove NP-hardness.

On the other hand, one can try to develop practical algorithms for detecting profiles that are almost single-peaked/single-crossing, by encoding the associated problem as an instance of SAT or an integer linear program and running the respective solver; such algorithms could then be applied to PrefLib data.
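
As a starting point, and only for the simplest case (exact single-peakedness with respect to a fixed axis, rather than any of the closeness measures above; is_single_peaked and the example axis are invented for illustration), a ranking is single-peaked on an axis exactly when each of its top-k prefixes occupies a contiguous block of axis positions:

    # A ranking (most to least preferred) is single-peaked with respect to a
    # left-to-right axis iff, for every k, the voter's top-k candidates occupy
    # a contiguous block of positions on the axis.
    def is_single_peaked(ranking, axis):
        pos = {candidate: i for i, candidate in enumerate(axis)}
        taken = [pos[c] for c in ranking]
        for k in range(1, len(ranking) + 1):
            prefix = taken[:k]
            if max(prefix) - min(prefix) != k - 1:
                return False
        return True

    axis = ["far-left", "left", "centre", "right", "far-right"]
    print(is_single_peaked(["centre", "left", "right", "far-left", "far-right"], axis))  # True
    print(is_single_peaked(["far-left", "far-right", "centre", "left", "right"], axis))  # False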

C This project aims to use machine learning techniques such as ensemble learning and convolutional neural networks to predict spot prices for a variety of industries. Machine learning is increasingly used in finance to make predictions as well as to aggregate among existing strategies for making investments over time. We will use various free as well as proprietary data sets to assess the value of our newly developed methods in terms of both profit and risk, and compare them with state-of-the-art techniques. This will also involve developing new "lucky factors" (features) that can be extracted from the data to inform and improve existing and new investment strategies.

The expectation is that the work will lead to a conference publication. Prerequisites: This project is suitable for someone with at least basic knowledge of machine learning.

C The first part of this project aims to use advanced graphical models (including enhancements of Hidden Markov Models etc.) to discover personalized trajectories for HIV disease progression, using available electronic health record data. The second part of the project aims to learn how personalized treatment and screening plans can affect disease trajectories in the short run and in the long run, with the overall goal of identifying effective treatment plans for various types of patients.

The dataset contains various types of patients and their responses to different medications over time. The project will also involve interacting with a renowned clinician specializing in HIV. In the short run, this work is expected to lead to a publication at an important conference. In the longer run, this work – and more generally, the development of these methods – will change and advance the way medicine is practiced. To read more about the role of machine learning in medicine, see /MedAdvance. Prerequisites: This project is suitable for someone with at least basic knowledge of neural networks and/or machine learning.
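
Purely as background on the graphical-model side (the two-state transition, emission and initial probabilities below are made-up toy numbers, not derived from any clinical data), the forward algorithm computes the likelihood of an observation sequence and the filtered state distribution for a basic HMM:

    import numpy as np

    # Forward algorithm for a small HMM: states could stand for coarse disease
    # stages, observations for a discretised lab marker. All numbers are toy values.
    A = np.array([[0.9, 0.1],        # state transition probabilities
                  [0.2, 0.8]])
    B = np.array([[0.7, 0.2, 0.1],   # emission probabilities: state -> symbol
                  [0.1, 0.3, 0.6]])
    pi = np.array([0.8, 0.2])        # initial state distribution

    def forward(obs):
        alpha = pi * B[:, obs[0]]
        for o in obs[1:]:
            alpha = (alpha @ A) * B[:, o]
        return alpha

    alpha = forward([0, 0, 1, 2, 2])             # an example observation sequence
    print("P(observations) =", alpha.sum())
    print("P(state | observations) =", alpha / alpha.sum())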

C This project aims to use convolutional neural networks to discover how best to treat patients with asthma, using available electronic health record data. The dataset contains information about various (types of) patients and their responses to different medications over time. The focus of the project is to train a convolutional neural network to identify how best to treat patients over time. The project will involve interacting with a renowned clinician specializing in asthma diagnosis and treatment. In the short run, this work is expected to lead to a publication at an important conference.

In the longer run, this work – and more generally, the development of these methods – will change and advance the way medicine is practiced. To read more about the role of machine learning in medicine, see /MedAdvance. Prerequisites: This project is suitable for someone with at least basic knowledge of neural networks and/or machine learning.

B 2017-18C The guarded negation fragment of first-order logic is an expressive logic of interest in databases and knowledge representation. It has been shown to have a decidable satisfiability problem but, to the best of my knowledge, there is no tool actually implementing a decision procedure for it. The goal would be to design a tool to determine whether or not a formula in this logic is satisfiable.

Most likely, this would require designing and implementing a tableau-based algorithm, in the spirit of related tools for description logics and the guarded fragment. Prerequisites: Logic and Proof (or equivalent). There are some connections to material in Knowledge Representation and Reasoning, but this is not essential background.

C Let F1 and F2 be sentences (in first-order logic, say) such that F1 entails F2: that is, any model of F1 is also a model of F2. An interpolant is a sentence G such that F1 entails G, and G entails F2, but G only uses relations and functions that appear in *both* F1 and F2. The goal in this project is to explore and implement procedures for constructing interpolants, particularly for certain decidable fragments of first-order logic.

It turns out that finding interpolants like this has applications in some database query rewriting problems. Prerequisites: Logic and Proof (or equivalent)

B 2017-18C After hand surgery, patients almost always need physiotherapy to help with their recovery. As part of this, the patient will need to perform hand exercises at home. However, the patient may not always do the exercises correctly, or they might forget to do them. The goal of this project is to use the Leap Motion to create a user-friendly GUI which a patient could use to help with their home exercises.

The interface would show the user where their hand should be, and they would then need to follow the movements. It could be delivered as web-based software or as a downloaded application. It would need to be tailored to the patient so that it contained their specific required exercises, which could be input by the physiotherapist. It would need to store data on how the patient is doing and feed this data back to the patient, and possibly also to the physiotherapist via the internet. If internet-based, patient confidentiality and security would need to be considered.

This project would be performed in close collaboration with a physiotherapist, an orthopaedic hand surgeon, and a post-doctoral researcher based at the Nuffield Orthopaedic Centre.

B 2017-18C Computed tomography (CT) scanning is a ubiquitous scanning modality. It produces volumes of data representing internal parts of a human body. Scans are usually output in a standard imaging format (DICOM) and come as a series of axial slices (i.e. slices across the length of the person's body, in planes perpendicular to the imaginary straight line along the person's spine). The slices most frequently come at a resolution of 512 x 512 voxels, achieving an accuracy of about 0.5 to 1mm of tissue per voxel, and can be viewed and analysed using a variety of tools. The distance between slices is a parameter of the scanning process and is typically much larger, about 5mm.

During the analysis of CT data volumes it is often useful to correct for the large spacing between slices.

For example, when preparing a model for 3D printing, the axial voxels would appear elongated. This could be corrected through an interpolation process along the spinal axis. This project is about the interpolation process, either in the raw data output by the scanner, or in the post-processed data which is being prepared for further analysis or 3D printing. The output models would ideally be files in a format compatible with 3D printing, such as STL. The main aesthetic feature of the output would be measurable as a smoothness factor, parameterisable by the user.
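
A minimal sketch of the basic idea (linear interpolation only, with the 5mm slice spacing and roughly 1mm in-plane resolution quoted above hard-coded as assumptions; a real implementation would read the spacings from the DICOM headers and would likely offer higher-order or shape-aware interpolation):

    import numpy as np

    # Linearly interpolate extra slices along the axial (z) axis so that the 5mm
    # slice spacing approaches the ~1mm in-plane resolution (roughly isotropic voxels).
    def resample_slices(volume, slice_spacing_mm=5.0, target_spacing_mm=1.0):
        nz = volume.shape[0]
        new_nz = int(round((nz - 1) * slice_spacing_mm / target_spacing_mm)) + 1
        z = np.linspace(0, nz - 1, new_nz)     # new slice positions in old-slice units
        lo = np.floor(z).astype(int)
        hi = np.minimum(lo + 1, nz - 1)
        frac = (z - lo)[:, None, None]
        return (1 - frac) * volume[lo] + frac * volume[hi]

    volume = np.random.rand(40, 512, 512)       # 40 axial slices of 512 x 512 voxels
    resampled = resample_slices(volume)
    print(volume.shape, "->", resampled.shape)  # (40, 512, 512) -> (196, 512, 512)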

Existing DICOM image analysis software designed within the Spatial Reasoning Group at Oxford is available to use as part of the project.

C The Medical Imaging research group has been working with a variety of data sourced from CT and MRI scans. This data comes in collections of (generally greyscale) slices which together make up 3D images. Our group has developed software to generate 3D models of the major organs in these images. This project aims to develop a simple augmented reality simulation for the Oculus Rift which will render these organs within a transparent model of a human and allow the user to walk around the model so as to view the organs from any angle.

This has a number of possible applications, including training medical students and helping surgeons explain medical procedures to their patients.

C Scientists in the Experimental Psychology Department study patients with a variety of motor difficulties, including apraxia - a condition, usually following a stroke, in which the patient lacks control over their hands or fingers. Diagnosis and rehabilitation are traditionally carried out by Occupational Therapists. In recent years, computer-based tests have been developed in order to remove the human subjectivity from the diagnosis, and in order to enable the patient to carry out a rehabilitation programme at home. One such test involves users being asked to carry out static gestures above a Leap Motion sensor, and these gestures being scored according to a variety of criteria.

A prototype has been constructed to gather data, and some data has been gathered from a few controls and patients. In order to deploy this as a clinical tool in the NHS, there is a need for a systematic data collection and analysis tool, based on machine learning algorithms, to help classify the data into different categories. Algorithms are also needed in order to classify data from stroke patients, and to assess the degree of severity of their apraxia. Also, the graphical user interface needs to be extended to give particular kinds of feedback to the patient in the form of home exercises, as part of a rehabilitation programme. This project was originally set up in collaboration with Prof Glyn Humphreys, Watts Professor of Experimental Psychology.

Due to Glyn's untimely death, a new co-supervisor needs to be found in the Experimental Psychology Department. It is unrealistic to assume this project can run in the summer of 2016.

C In recent years, medical diagnosis using a variety of scanning modalities has become quasi-universal and has brought about the need for computer analysis of digital scans. Members of the Spatial Reasoning research group have developed image processing software for CT (tomography) scan data. The program partitions (segments) images into regions with similar properties.

These images are then analysed further so that particular features (such as bones, organs or blood vessels) can be segmented out. The team's research continues to deal with each of these two separate meanings of medical image segmentation. The existing software is written in C++ and features carefully crafted and well-documented data structures and algorithms for image manipulation. In some applications (e.g. orthopaedic surgery involving the hip and knee joint) the magnetic resonance scanning modality (MRI) is preferred, both because of its safety (no radiation involved) and because of its increased visualisation potential. This project is about converting MRI scan data into a format compatible with the existing segmentation algorithms. The data input would need to be integrated into the group's analysis software in order then to carry out 3D reconstructions and other measurements. This project is co-supervised by Professor David Murray MA, MD, FRCS (Orth), Consultant Orthopaedic Surgeon at the Nuffield Orthopaedic Centre and the Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences (NDORMS), and by Mr Hemant Pandit MBBS, MS (Orth), DNB (Orth), FRCS (Orth), DPhil (Oxon), Orthopaedic Surgeon / Honorary Senior Clinical Lecturer, Oxford Orthopaedic Engineering Centre (OOEC), NDORMS.

C Scientists in the Experimental Psychology Department study patients with a variety of motor difficulties, including apraxia - a condition, usually following a stroke, in which the patient lacks control over their hands or fingers.

Diagnosis and rehabilitation are traditionally carried out by Occupational Therapists. In recent years, computer-based tests have been developed in order to remove the human subjectivity from the diagnosis, and in order to enable the patient to carry out a rehabilitation programme at home. One such test involves users drawing simple figures on a tablet, and these figures being scored according to a variety of criteria. Data has already been gathered from 200 or so controls, and is being analysed for a range of parameters in order to assess what a neurotypical person could achieve when drawing such simple figures. Further machine learning analysis could help classify such data into different categories.

Algorithms are also needed in order to classify data from stroke patients, and to assess the degree of severity of their apraxia. This project was originally co-supervised by Prof Glyn Humphreys, Watts Professor of Experimental Psychology. Due to Glyn's untimely death, a new co-supervisor needs to be found in the Experimental Psychology Department. It is unrealistic to assume this project can run in the summer of 2016.

C Knee replacement surgery involves a precise series of steps that a surgeon needs to follow.

Trainee surgeons have traditionally mastered these steps by learning from textbooks or experienced colleagues. Surgeons at the Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences (NDORMS) in Oxford have been working on a standardised method to help trainees internalise the sequence of events in an operation. It is proposed to construct a computer-based tool which would help with this goal. Apart from the choice of tools and materials, the tool would also feature a virtual model of the knee.

The graphical user interface will present a 3D model of a generic knee to be operated on, and would allow the user to make the cuts necessary for the knee replacement procedure.

There would be pre-defined parameters regarding the type and depth of each cut, and an evaluation tool assessing how the virtual cuts compare against these parameters. The project goals are quite extensive, so this would be suitable for an experienced programmer. This project is co-supervised by Professor David Murray MA, MD, FRCS (Orth), Consultant Orthopaedic Surgeon at the Nuffield Orthopaedic Centre and the Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences (NDORMS), and by Mr Hemant Pandit MBBS, MS (Orth), DNB (Orth), FRCS (Orth), DPhil (Oxon), Orthopaedic Surgeon / Honorary Senior Clinical Lecturer, Oxford Orthopaedic Engineering Centre (OOEC), NDORMS.

B 2017-18C Conventional computer-aided design (CAD) software uses methods such as extrusion and revolution tools that the user can apply to create the 3D shape of a part. These tools are based on traditional manufacturing methods and work very well for most CAD applications.

One application for which these tools do not work well is creating 3-dimensional representations of textiles. The exact path of each fibre within the textile depends on the other fibres and on the flexibility of the fibres. The purpose of this project is to create a simple software tool/algorithm into which a user can input a weave pattern (flat) or a braid pattern (cylindrical), together with the flexibility of the fibres, and which will create a 3-dimensional representation of the structure.

B 2017-18C Experimental data inevitably contains error. These experimental observations are often compared to theoretical predictions by posing the comparison as a least squares problem, i.e. minimising the sum of squares between the experimental data and the theoretical predictions.

These least squares problems are often solved using a QR-factorisation of a known matrix, which uses the Gram-Schmidt method to write the columns of this matrix as a linear combination of orthonormal vectors. This method, when used in practice, can exhibit numerical instabilities, where the (inevitable) numerical errors due to fixed-precision calculations on a computer are magnified and may swamp the calculation. Instead, a modified Gram-Schmidt method is used for the QR-factorisation.
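
A minimal numpy sketch of the two variants (illustrative only, not the implementation the project asks for): both compute a QR-factorisation, but on an ill-conditioned matrix the classical method loses orthogonality, visible in the size of ||Q^T Q - I||.

    import numpy as np

    def classical_gram_schmidt(A):
        A = A.astype(float)
        m, n = A.shape
        Q, R = np.zeros((m, n)), np.zeros((n, n))
        for j in range(n):
            v = A[:, j].copy()
            for i in range(j):
                R[i, j] = Q[:, i] @ A[:, j]   # project the *original* column
                v -= R[i, j] * Q[:, i]
            R[j, j] = np.linalg.norm(v)
            Q[:, j] = v / R[j, j]
        return Q, R

    def modified_gram_schmidt(A):
        Q = A.astype(float).copy()
        n = Q.shape[1]
        R = np.zeros((n, n))
        for j in range(n):
            R[j, j] = np.linalg.norm(Q[:, j])
            Q[:, j] /= R[j, j]
            for k in range(j + 1, n):
                R[j, k] = Q[:, j] @ Q[:, k]   # project the *partially reduced* column
                Q[:, k] -= R[j, k] * Q[:, j]
        return Q, R

    # An ill-conditioned Vandermonde matrix makes the loss of orthogonality visible.
    A = np.vander(np.linspace(1, 2, 12), 12, increasing=True)
    for name, qr in [("classical", classical_gram_schmidt), ("modified", modified_gram_schmidt)]:
        Q, _ = qr(A)
        print(name, np.linalg.norm(Q.T @ Q - np.eye(A.shape[1])))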

This modified Gram-Schmidt factorisation avoids numerical instabilities, but is less computationally efficient. The first aim of this project is to investigate the relative computational efficiencies of the two methods for QR-factorisation. The second aim is to use the QR-factorisation to identify: (i) what parameters can be recovered from experimental data; and (ii) whether the data can automatically be classified as "good" or "bad". B (2017-18)

B 2017-18C Many numerical algorithms have error bounds that depend on some user-provided input. For example, the error in a numerical method for solving a differential equation is bounded in terms of the step size h, and so the user may change the step size h until a desired accuracy is attained.

Although useful, these error bounds do not take account of computational efficiency. For example, a numerical method for solving a differential equation may have a very impressive bound with respect to the step size h, but may require significantly more computational effort than other methods with less impressive error bounds. The field of scientific computing is a rich source of algorithms such as these, for example the numerical solution of differential equations, the numerical solution of linear systems, and the interpolation of functions. The aim of this project is to work with a variety of algorithms for solving a given problem, and to assess the computational efficiency of these algorithms.

B 2017-18C Many matrix applications involve sparse matrices, i.e. matrices that have a very large number of rows and columns, but only a small number of non-zero entries in each row.

On a given computer we may only store a finite number of matrix entries. When working with sparse matrices we usually store only the non-zero entries and their locations. There are several established techniques for storing sparse matrices.
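
For example, the widely used compressed sparse row (CSR) scheme stores three arrays and gives fast matrix-vector products, at the price of making it awkward to insert new non-zeros; the sketch below is generic and not tied to any particular library.

    import numpy as np

    # Compressed Sparse Row (CSR):
    #   data    - the non-zero values, stored row by row
    #   indices - the column index of each value in `data`
    #   indptr  - indptr[i]:indptr[i+1] slices row i out of `data`/`indices`
    def csr_matvec(data, indices, indptr, x):
        n_rows = len(indptr) - 1
        y = np.zeros(n_rows)
        for i in range(n_rows):
            start, end = indptr[i], indptr[i + 1]
            y[i] = data[start:end] @ x[indices[start:end]]
        return y

    # The 3x3 matrix [[4, 0, 9], [0, 7, 0], [0, 0, 5]] in CSR form:
    data = np.array([4.0, 9.0, 7.0, 5.0])
    indices = np.array([0, 2, 1, 2])
    indptr = np.array([0, 2, 3, 4])
    print(csr_matvec(data, indices, indptr, np.array([1.0, 1.0, 1.0])))  # [13. 7. 5.]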

These methods all have individual strengths and weaknesses: some allow efficient multiplication of sparse matrices by vectors, others allow entries to be modified efficiently, while others allow the sparsity pattern to be modified dynamically. The aim of this project is to investigate these sparse storage methods, and to highlight the strengths and weaknesses of each approach.

B 2017-18C The goal of this project is to write a program that model checks a Markov chain against an LTL formula, i.e. calculates the probability that the formula is satisfied.

The two main algorithmic tasks are to efficiently compile LTL formulas into automata and then to solve systems of linear equations arising from the product of the Markov chain and the automaton. An important aspect of this project is to make use of an approach that avoids determinising the automaton that represents the LTL formula. This project builds on material contained in the Logic and Proof and Models of Computation courses. Reference: An optimal automata approach to LTL model checking of probabilistic systems, Proceedings of LPAR'03, LNCS 2850, Springer 2003.

B 2017-18C Interval program analysis is a well-known algorithm for estimating the behaviour of programs without actually running them.

The algorithm takes an imperative program and returns, at each program point, interval constraints for the variables in the program, such as 1 <= x <= 3 && 2 <= y. The standard formulation assumes that the whole of the program being analysed is available. However, in practice, this assumption is not necessarily met.


Programs often use library functions whose source code is not available.

The goal of this project is to lift this assumption.

During the project, a student will develop an interval-analysis algorithm that works in the presence of calls to unknown library functions, implement the algorithm, and evaluate it experimentally.
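
A minimal sketch of the flavour of the analysis (a hypothetical interval domain over plain tuples, not the algorithm to be developed): intervals are joined where control-flow paths merge, and without a summary for an unknown library call the only sound transfer function is to return the full interval.

    # Intervals are (lo, hi) pairs; None means "unreachable" (bottom).
    NEG_INF, POS_INF = float("-inf"), float("inf")
    TOP = (NEG_INF, POS_INF)

    def join(a, b):
        # Least upper bound of two intervals, used where control-flow paths merge.
        if a is None:
            return b
        if b is None:
            return a
        return (min(a[0], b[0]), max(a[1], b[1]))

    def add(a, b):
        # Abstract addition: [a0, a1] + [b0, b1] = [a0 + b0, a1 + b1].
        return (a[0] + b[0], a[1] + b[1])

    def call_unknown(args):
        # Without a summary for the library function, the only sound result is TOP.
        return TOP

    x, y = (1, 3), (2, 2)          # 1 <= x <= 3, y == 2
    z = add(x, y)                  # 3 <= z <= 5
    w = call_unknown([z])          # unknown library call: anything is possible
    print(z, w, join((0, 1), (4, 7)))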

Prerequisites: the Compilers course. Undergraduate students who wish to enquire about a project for 2017-18 are welcome to contact Prof Yang but should note that the response may be delayed as he is on sabbatical.

C Recently, researchers in machine learning have developed new Turing-complete languages, such as Anglican and Church, for writing sophisticated probabilistic models and performing various inference tasks on those models, such as the computation of posterior probabilities. The goal of this project is to study these languages using tools from programming-language research.

Specifically, a student will work on developing a new inference algorithm for probabilistic programs that mixes techniques from program analysis with those from Monte Carlo simulation, a common method for performing inference on probabilistic programs. Alternatively, the student will explore the connection between the use of computational effects in higher-order functional probabilistic programming languages and the encoding of advanced probability models in those languages (in particular, nonparametric Bayesian models), which has been pointed out by the recent work of Dan Roy and his colleagues. Prerequisites: the Compilers and Machine Learning courses. The Programming Languages course is not required, but is useful for carrying out this project. Undergraduate students who wish to enquire about a project for 2017-18 are welcome to contact Prof Yang but should note that the response may be delayed as he is on sabbatical.
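
Purely as background on the Monte Carlo side (a generic likelihood-weighting sketch for a toy beta-binomial model, written directly in Python rather than in Anglican or Church, and not the inference algorithm the project would develop):

    import random

    # Likelihood weighting for a tiny probabilistic program:
    #   p ~ Uniform(0, 1); observe 7 heads in 10 flips; query the posterior mean of p.
    def run_program():
        p = random.random()                                # sample statement
        heads, flips = 7, 10
        weight = p ** heads * (1 - p) ** (flips - heads)   # observe statement
        return p, weight

    def posterior_mean(num_samples=100_000):
        total, norm = 0.0, 0.0
        for _ in range(num_samples):
            p, w = run_program()
            total += w * p
            norm += w
        return total / norm

    print(posterior_mean())   # close to 8/12 = 0.667, the exact Beta(8, 4) posterior mean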

What is the best thing about studying Computer Science at Balliol? Balliol has both incredible computer science tutors, who've taught me so much of what I know over the last three years, and one of the most welcoming atmospheres among students I've ever come across. Coming to Balliol I was nervous about the prospect of making new friends alongside having a large amount of work to do, but the tutors and other students all made this transition incredibly easy. The tutors are always willing to organise extra sessions outside of tutorials if you need them, and every student in the college is unbelievably friendly and welcoming. I was especially surprised by how sociable the other students in college are, as that isn't the reputation that Oxford and computer science often have – so please don't let those stereotypes put you off.

What advice would you give prospective applicants for your course? Make sure you're not looking to do a large amount of practical programming.

At Oxford we do programming, but we also study the theory of Computer Science. I personally find that learning the theoretical side of the subject helps massively when it comes to writing efficient and well-constructed programs; however, if what you are after is a degree focused on programming, then Oxford may not be the place for you.

Matthew Hillman

What is the best thing about studying Maths & Computer Science at Balliol? When I was applying to study Mathematics and Computer Science, I was worried that I might feel a little bit out of place in both departments. At Balliol, this is certainly not the case. Everyone is very friendly and welcoming.

What's more, most of the in-college social events for the departments are actually run jointly for mathematicians and computer scientists anyway. The tutors at Balliol are incredibly helpful - the subjects are both challenging, but the tutors do an excellent job of highlighting important material and explaining it in a way that makes it easier to understand. In my opinion, the subject choice gives you the best of both worlds. On the maths side, you get to do all of the pure maths modules and only miss out on the applied maths modules. On the computing side, you do the theoretical and programming modules and don't do the more practical modules.

In later years, you get more choice in the modules you do, so when you find out which you're more interested in, you can choose to focus on it. I've always loved both maths and computing, and doing these courses at Balliol is an excellent way to explore these subjects in great detail.

What advice would you give prospective applicants for your course? If you can't decide whether to do maths or computer science, just do both! The combined course is great fun and gives you a broader knowledge than doing just one of the two.

Tiffany Duneau

What is the best thing about studying Computer Science & Philosophy at Balliol? Although I am the only person in my year studying Computer Science and Philosophy at Balliol, I have not had any trouble meeting new people and forming close friendships with many people here, as well as being able to meet the others on my course who are at different colleges. (One great thing about being on such a small course overall is that everyone knows each other.)

On the flip side, that does mean that most people have never heard of the course at all, but getting past the 'but how exactly does your degree work together?' questions in freshers' week was definitely a worthwhile challenge, as I have thoroughly enjoyed spending time here at Balliol studying Computer Science and Philosophy. The tutors are really great and approachable, and do an amazing job of patching up our understanding when the week's material didn't go that well, and the informal atmosphere avoids making things as awkward as you might expect with two students face to face with a leading researcher. The best aspect of the course is probably that both halves are so different (and yet also very interconnected), so I can 'take a break' from trying to figure out how Oberon (an obscure programming language we use in second term) works, and instead question whether I would survive being teleported to Mars.

What advice would you give prospective applicants for your course? The main thing I would recommend is to try reading about parts of philosophy that seem particularly interesting to you - I chose to focus on Turing since his work relates to both Computer Science and Philosophy. It's also worth trying your hand at some coding to get a feel for what parts of the course will be like.

I had never really studied philosophy before getting here, and didn't have a great deal of experience with programming, but that has not been an issue at all since no lectures or courses presume prior knowledge.

Kai Laddiman, Magdalen College

I come from Romania, where I studied at Colegiul National "Emil Racovita" Iasi. I took the Romanian Bacalaureate, where I achieved 92. I knew I wanted to study Computer Science rather early on.

I think the most fantastic thing for me about the course is the focus on theoretical knowledge. There is nothing more exciting than being faced with a puzzle and finding a method to solve it. I am among those Computer Scientists who do not particularly like computers, as in the hardware side. For me it is enough to know that it is a black box which does what I tell it to do.

When it comes to clubs and societies, Oxford has it all. If it doesn't already have it, you will find enough people to make it. Personally, I got involved in all kinds of societies. I ended up playing in a musical organized by the Oxford University Light Entertainment Society; went raiding colleges late at night with the Assassin's Guild; played various characters in the Oxford University Role Playing Games Society's games and ended up organizing next year's game; became the Oxford University Computer Society secretary; and played Poohsticks on the bridges of Addison's Walk.

There are endless possibilities and the people here are so diverse it is impossible not to find people you enjoy yourself with (as long as you go searching).

It is as diverse as London (maybe even more diverse) and the high concentration of students makes it a rather quirky place. Do not be surprised by people in elf attire running around with plastic lightsabers *cough*.

Chris Kew, New College

Before I came to Oxford, I was at Lord Williams's School, an academy school in Thame, England. I did Maths, Further Maths, Physics, and Chemistry over three years, since I was recovering from CFS/ME at the beginning of my A-levels.

I got A*s in everything except Chemistry, in which I got an A. I chose computing during the process of writing my personal statement -- I wasn't terribly fixed on a course (it was between Comp, Maths, and Physics), and basically chose Comp because I had some experience programming, I was bad at experimental work, and I wasn't sure I'd like the level of abstraction in Maths. Now I am at Oxford, I play non-contact ice hockey twice a week. As a non-skater before coming to the university, this took a bit of learning (and a few bruises). I currently play the drum in the ceilidh band, I'm also now band treasurer, and I've learned to play bridge.

Oriel College

I'm German and took my Abitur at a Gymnasium in Germany. My Leistungskurse (advanced subjects) were English and Maths, and my third and fourth subjects were Computer Science and History. I wanted to study either Computer Science or Maths; however, I didn't really like programming.

I eventually decided to do Computer Science at Oxford, because the course is very theoretical, includes a lot of maths and there is a wide range of advanced choices in later years. The tutorial system is a really good way to deepen your understanding of the subjects. Whenever you need help, you can just ask the omniscient computer science wizard in your college and they will take the time to explain it to you. I really enjoyed the Digital Systems course, because it removed the abstraction of programming for me. The course works through how a computer is built, starting at transistors and logic gates and eventually reaching a machine that can run code.

From what I heard, people are going into a wide range of sectors after graduating, including (of course) software development, but also finance, consulting and many more. I really haven't made my mind up what I want to do yet, but I think this course will be a great preparation for all of them.

St John's College

I previously attended Barton Peveril Sixth Form College in Hampshire, England, studying A-levels in Maths, Further Maths, Physics and Computing, as well as an Extended Project and Additional Further Maths, Music and Government & Politics at AS level. Oxford life is fantastically busy with challenging work (but when you enjoy the subject you usually don't mind), socialising (easy when everything and everyone in Oxford is unbelievably close) and taking part in the wealth of extra-curricular fun (currently I'm involved with an orchestra, Christian Union and the Invariants Maths society). As well as the amazing and diverse people and activities that now form my life, I have found the support, teaching and resources provided superb.

Josh Peaker, University College

I attended Pontefract New College, a state Sixth Form College in West Yorkshire, and achieved A*s in Maths and Further Maths at A-level, and an A in Physics. I love the way Oxford chooses to teach Computer Science. Whereas other universities focus on just learning several languages, Oxford really focuses on WHY you use them, and teaches how to create computer programs not in a specific language, as that's the easy part, but in terms of the general ideas. Oxford's just a busy, hectic, beautiful, brilliant place to live.

There are a crazy number of clubs and societies for so many things; there's always something going on.

Xavier Wilders, St Anne's College

I attended the Ermitage School of France, near Paris, where I took the OIB, a French Baccalaureate. I did not think I would get into Oxford. My school had never had a student go off to such a high-ranking university. I've been teaching myself computing since the age of 7.

I love the mix of a scientific subject in which creativity is strongly needed to solve problems. I wanted to study in England, and both Oxford's reputation and tutorial system attracted me. The best thing about Oxford is the tutorial system, and how easy it is to contact tutors. I would recommend the course to others, as long as they are ready to study CS from a theoretical point of view.

Lukas Bosko, Merton College

I am from Slovakia, where I studied for an International Baccalaureate, before moving to England to start my degree at Oxford.

The Department of Computer Science offers a unique place for people with a broad range of interests and goals, whether in academia or in the private sector. Staff and professors are always supportive and the environment is very professional. All the courses are very well organised and the lecturers readily share all the course materials, which was a great advantage. The atmosphere of the place makes it just a wonderful place to study.

Maths and Computer Science is a degree that can lead into many disciplines. I quickly realised that, thanks to optional courses, I would be able to tailor the course in 2nd and 3rd year completely to my needs. I would highly recommend the Maths and Computer Science course - especially to people who want to work in software houses, academia and finance, including investment banks.

St Catherine's College

I went to Taunton's, a Sixth Form College in Southampton.

I studied Mathematics, Further Maths, Physics and Computing at A-level, and Chemistry at AS, and I received As in all those subjects. I chose to study Computer Science because it is a combination of mathematics and computers and has applications in many other fields as well. The Oxford course strikes a balance between theory and practice that appeals to me. Although at the time I was deciding I thought I'd be more interested in the practical side of things, I actually find that I prefer the theory now. The best thing about studying Computer Science at Oxford is that the small class size allows for more two-way communication between student and tutor. I've especially enjoyed the Functional Programming and Principles of Programming Languages courses because they broadened the way I think about programs - chiefly, what they are and how they work.

If I could give a prospective student one piece of advice about coming to study Computer Science at Oxford, I'd say don't decide not to apply because you don't think you're good enough - you could be very surprised. Originally I thought I would not be likely to be accepted into Oxford, but I decided to apply anyway, mainly just to see if I could. A friend of mine said that it would be worth it if only so that he could frame the rejection letter! Even if I hadn't managed to get into Oxford, I definitely think I would have benefited just from getting the applications to all the other universities done early, so I had less to worry about down the line.

Lubomir Atanassov, Balliol College

Before coming to Oxford I studied in my home country, Bulgaria. I attended a specialised mathematical high school.

Where I come from, the name 'Oxford' means a lot. I chose Mathematics and Computer Science as a subject because at high school level I did a lot of Maths; however, I felt that there was more to the subject. Computer Science has always fascinated me. I have found that the best thing about studying at Oxford is the truly unique mixture of people. The facilities available to students and the support we get are amazing.

The classes at Oxford are small - and that offers a lot of advantages. The part of the course I most enjoy is the practical sessions, as they help me realise how the material we have covered in lectures could be implemented in practice. As well as studying, I am the president of the Bulgarian Society, and I am a keen rower. The highlight of my time at Oxford so far was during the Summer Eights (Oxford's biggest inter-college rowing race) when our boat won 'blades'. (Crews who perform particularly well are rewarded with the presentation of their oars – 'blades' – decorated with a record of their achievement.)

Hannah Thomas, Worcester College

I'm from a state school in Maidstone, Kent, where I studied German, French, Maths and Further Maths for A-level. I'd always been keen on maths, and my interest in problem solving and languages led me to Computer Science. After attending an Open Day, I decided Oxford was the place for me: the tutors were so approachable and the course incredibly flexible in the ways you can combine the two subjects. It also did not assume any prior knowledge of Computer Science. Any initial apprehension I had quickly became unimportant, as it was easy to settle in.

I knew deep down this was what I wanted to do and where I wanted to be, in Oxford's beautiful surroundings with a chance to pursue my interests to a high level. Working in small groups with others who are passionate about the subject is a real benefit. The course has been challenging and exciting, as I have discovered new mathematical 'tools' and ideas and been given a chance to be creative, for example in producing rigorous proofs and algorithms. A lot of ideas now make more sense to me as I am studying topics in more depth. With the numerous organisations, it's been easy to carry on the extra-curricular activities I enjoyed before.

I have the privilege of singing as a choral scholar in the 18th-century college chapel, conducting a college choir and playing in the orchestra. Just from being around college I've also been able to meet a wide range of people with different interests and backgrounds. Another perk of life in Oxford is the great facilities: the halls serving fantastic formal meals and the en suite rooms are among these. The college also has a very safe environment, making daily life away from home pleasant and easy to cope with.

Anne-Marie Imafidon, Keble College

I went to the Latymer School in East London and took A-levels in Maths, Further Maths, French and ICT, and AS-levels in Further Additional Maths and Physics.

I chose Oxford because the course was an even split of Computer Science and Maths from the beginning, without a bias towards either one. I chose Mathematics and Computer Science because I have always had a keen interest in both and appreciate that they have a very strong link and large areas of overlap, and that ultimately mathematics underpins most of Computer Science. The course has been amazingly flexible and I have enjoyed the benefits of having the best of both worlds: the ability to choose any mathematical module or any computing module to suit my interests. I think Oxford is just the right kind of city to study in; it ensures that you can study without too many distractions, but have fun if you need to relax or unwind.

The highlights of the course have been studying Databases last term and finding out more about robots in the Intelligent Systems module. The variety of topics means you can always find a module that personally interests you. The highlights of being at Oxford have been the various opportunities to work with really smart people on committees or socially, and also the great resources we have access to, including our Careers Service and the Radcliffe Science Library. Lastly, the tutorial system means you can get all the help you need, and it keeps you on your toes! In college, I had a role on our JCR committee and I am a member of the Christian Union. I was also the Keble College rep serving on the Mathematics Undergraduate Representative Committee (MURC) and I played on the college netball team.

Dan Surman, Oriel College

I live in a town called Witney, about 20 minutes from Oxford. I attended The Henry Box School, one of the two local comprehensives, and achieved AAAB in Maths, Further Maths, Chemistry and Physics respectively. I chose the course because I wanted to be taught the principles that underpin programming languages, allowing me to easily learn new languages when required, as opposed to being taught specific details of individual languages, which I could learn myself.

The course is well structured and well taught, which is fortunate as I find it very demanding.

A particular highlight of the course so far is the Concurrent Programming lecture series, which explains how to write programs, using different techniques, that correctly solve problems in the field of concurrency. As well as studying, I also play football for the college first XI, badminton, basketball and pool.

Tom Perry, St John's College

I grew up on the Wirral, near Liverpool, and studied at my local comprehensive, where I took Maths, Further Maths, Physics and Biology at A-level. By the time I started looking at further education my attention had shifted towards Computer Science. I chose Oxford because of its tutorial system and its outstanding reputation as a leading institution in Computer Science.

Due to my relative lack of knowledge about computers and programming, the fact that the course started from the basics was an advantage. The course has proved to be better than I had imagined, as it is well structured, giving a good basis in Maths and programming before broadening to allow you to pursue your fields of choice. Oxford provides all the benefits of a city university plus, because of the college system, a strong community spirit which can be very rewarding to be involved in. I have been involved in many activities, including the college and university ballroom dancing teams, serving as Welfare Officer for my college, being a member of various other societies and being an eager supporter of all social activities. Overall, I think Oxford provides the perfect student life and will leave me with many happy memories.