
Past Informatics Research Institute Seminars by Reverse Date

 


Informatics Research Institute Seminars (Historical)

Spatio-temporal GIS
Prof. Christophe Claramunt (Naval Academy Research Institute, Brest, France)
4 Jul 2011, Harrison 170, Monday 3pm, Computer Science
TBA

 

Decentralized spatial computing for geosensor networks, especially in movement analysis
Dr Patrick Laube (Department of Geography, University of Zurich)
4 Jul 2011, Harrison 170, Monday 4pm, Computer Science
TBA

 

Latent Force Models
Prof. Neil Lawrence (Department of Computer Science, University of Sheffield)
16 Mar 2011, Harrison 215, Wednesday 2pm, Computer Science
Physics based approaches to data modeling involve constructing an accurate mechanistic model of data, often based on differential equations. Machine learning approaches are typically data driven - perhaps through regularized function approximation. These two approaches to data modeling are often seen as polar opposites, but in reality they are two different ends to a spectrum of approaches we might take. In this talk we introduce latent force models. Latent force models are a new approach to data representation that model data through unknown forcing functions that drive differential equation models. By treating the unknown forcing functions with Gaussian process priors we can create probabilistic models that exhibit particular physical characteristics of interest, for example, in dynamical systems resonance and inertia. This allows us to perform a synthesis of the data driven and physical modeling paradigms. We will show applications of these models in systems biology and (given time) modelling of human motion capture data.
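As a toy illustration of the idea (not Prof. Lawrence's implementation), the sketch below draws a forcing function from a Gaussian process prior and pushes it through a simple first-order linear ODE; the kernel, sensitivity and decay parameters are all invented for the example.

```python
import numpy as np

# Toy first-order latent force model: dx/dt = S * f(t) - D * x(t),
# where the unknown forcing function f is given a GP prior (RBF kernel).
# All parameter values below are illustrative only.

def rbf_kernel(t, lengthscale=0.5, variance=1.0):
    d = t[:, None] - t[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(0)
t = np.linspace(0.0, 5.0, 200)

# Draw one sample path of the latent force from the GP prior.
K = rbf_kernel(t) + 1e-8 * np.eye(len(t))
f = rng.multivariate_normal(np.zeros(len(t)), K)

# Integrate the ODE with a simple Euler scheme.
S, D = 1.0, 0.8            # sensitivity and decay (illustrative)
x = np.zeros_like(t)
dt = t[1] - t[0]
for i in range(1, len(t)):
    x[i] = x[i - 1] + dt * (S * f[i - 1] - D * x[i - 1])

print("latent force range:", f.min(), f.max())
print("output range:      ", x.min(), x.max())
```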

 

Differential Geometric MCMC Methods
Prof. Mark Girolami (Department of Statistical Science, University College London)
15 Mar 2011, Harrison 209, Tuesday 3pm, Computer Science
In recent years a reliance on MCMC methods has been developing as the “last resort” to perform inference over increasingly sophisticated statistical models used to describe complex phenomena. This presents a major challenge, as issues surrounding correct and efficient MCMC-based statistical inference over such models are of growing importance. This talk will argue that differential geometry provides the tools required to develop MCMC sampling methods suitable for challenging statistical models. By defining appropriate Riemannian metric tensors and corresponding Levi-Civita manifold connections, MCMC methods based on Langevin diffusions across the model manifold are developed. Furthermore, proposal mechanisms which follow geodesic flows across the manifold will be presented. The optimality of these methods in terms of mixing time shall be discussed and the strengths (and weaknesses) of such methods will be experimentally assessed on a range of statistical models such as Log-Gaussian Cox Point Process models and Mixture Models. This talk is based on work that was presented as a Discussion Paper to the Royal Statistical Society and a dedicated website with Matlab codes is available at http://www.ucl.ac.uk/statistics/research/rmhmc
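For readers unfamiliar with Langevin-based MCMC, the sketch below implements only the simplest relative of the methods in the talk: a constant-metric (preconditioned) MALA step for a toy 2D Gaussian target. The position-dependent metric and geodesic proposals are deliberately omitted, and the target, metric and step size are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy target: zero-mean Gaussian with covariance C (log-density up to a constant).
C = np.array([[1.0, 0.9], [0.9, 1.0]])
C_inv = np.linalg.inv(C)

def log_target(x):
    return -0.5 * x @ C_inv @ x

def grad_log_target(x):
    return -C_inv @ x

# Constant "metric" G used as a preconditioner (here simply the precision matrix).
G = C_inv
G_inv = np.linalg.inv(G)
L = np.linalg.cholesky(G_inv)          # L @ z has covariance G_inv
eps = 0.8                              # step size (illustrative)

def proposal_mean(x):
    return x + 0.5 * eps**2 * G_inv @ grad_log_target(x)

def log_q(x_to, x_from):
    # Log-density of N(proposal_mean(x_from), eps^2 * G_inv), up to a constant.
    d = x_to - proposal_mean(x_from)
    return -0.5 / eps**2 * d @ G @ d

x = np.zeros(2)
samples = []
for _ in range(5000):
    prop = proposal_mean(x) + eps * L @ rng.standard_normal(2)
    log_alpha = (log_target(prop) + log_q(x, prop)
                 - log_target(x) - log_q(prop, x))
    if np.log(rng.random()) < log_alpha:
        x = prop
    samples.append(x)

print("sample covariance:\n", np.cov(np.array(samples).T))
```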

 

Metric Learning with Eigenvalue Optimization
Dr. Yiming Ying (Department of Computer Science, University of Exeter)
1 Mar 2011, Harrison 209, Tuesday 3pm, Computer Science (Internal)
In this talk I will mainly present a novel eigenvalue optimization framework for learning a Mahalanobis metric from data. Within this context, we introduce a novel metric learning approach called DML-Eigen, which is shown to be equivalent to a well-known eigenvalue optimization problem: minimizing the maximal eigenvalue of a symmetric matrix. Moreover, we show that similar ideas can be extended to large margin nearest neighbour classifiers (LMNN) and maximum-margin matrix factorisation for collaborative filtering. This novel framework not only provides new insights into metric learning but also opens new avenues to the design of efficient metric learning algorithms. Indeed, first-order algorithms scalable to large datasets are developed and their convergence analysis will be discussed in detail. Finally, we show the competitiveness of our methods by various experiments on benchmark datasets. In particular, we report an encouraging result on a challenging face verification dataset called Labeled Faces in the Wild (LFW).

 

Discrete Mereotopology in automated histological image analysis
Dr. David A. Randell (Medical Imaging Research Group, College of Medical and Dental Sciences, University of Birmingham)
22 Feb 2011, Harrison 209, Tuesday 3pm, Computer Science
This cross-disciplinary talk covers the integration of qualitative spatial reasoning (QSR) with quantitative histological image processing methods using digitised images of stained tissue sections and other preparations examined under the microscope. The talk will show how the QSR spatial logic Discrete Mereotopology can be used to model and exploit topological properties of segmented images of cells and their parts and general tissue architecture. Relation sets and other mathematical structures extracted from the theory are factored out and used to complement and guide algorithmic-based segmentation methods. The net result is a change of emphasis away from classical pixel-based segmentation algorithms to one where the primary ontological primitives are regions and their spatial relationships each to the other. The work forms part of that done by the Medical Imaging Group with a long-standing interest in: image segmentation in histopathology, quantitative measures of tissue architecture and complex data characterisation and visualisation.

 

Automating the Heuristic Design Process
Dr. Matthew Hyde (ASAP Group, University of Nottingham)
26 Jan 2011, Harrison 170, Wednesday 2pm, Computer Science
The current state of the art in the development of search methodologies is focused around the design of bespoke systems, which are specifically tailored to a particular situation or organisation. Such bespoke systems are necessarily created by human experts, and so they are relatively expensive. Some of our research at Nottingham is concerned with how to build intelligent systems which are capable of automatically building new systems. In other words, we aim to automate some of the creative process, making it less expensive by being less reliant on human expertise. In this talk, I will present some work we have recently published on the automatic design of heuristics for two-dimensional stock cutting problems. The research shows that genetic programming can be used to evolve novel heuristics which are at least as good as human-designed heuristics for this problem. Research into the automatic design of heuristics could represent a change in the role of the human expert, from designing a heuristic methodology to designing a search space within which a good heuristic methodology is likely to exist. The computer then takes on the more tedious task of searching that space, while we can focus on the creative aspect of designing it.

 

Many Objective Optimisation of Engineering Problems
Dr. Evan J. Hughes (Department of Informatics and Sensors, Cranfield University)
19 Jan 2011, Harrison 170, Wednesday 3pm, Computer Science
Most real engineering problems are characterised by having many criteria that are to be optimised simultaneously. Unfortunately the criteria are often conflicting and so have to be considered as a many-objective optimisation process in order to derive a trade-off surface of the available optimal solutions. Although a plethora of algorithms have been developed for optimising two-objective problems, many of them do not work well as the number of objectives increases. The talk introduces some of the new algorithms that have been developed for investigating many-objective problems and describes how the methods have been used to advance the design of airborne fire-control and surveillance radars.

 

Various Formulations for Learning the Kernel and Structured Sparsity
Prof. Massimiliano Pontil (Department of Computer Science, UCL)
1 Dec 2010, Harrison 170, Wednesday 2pm, Computer Science
The problem of learning a Mercer kernel is of central importance in the context of kernel-based methods, such as support vector machines, regularized least squares and many more. In this talk, I will review an approach to learning the kernel, which consists in minimizing a convex objective function over a prescribed set of kernel matrices. I will establish some important properties of this problem and present a reformulation of it from a feature space perspective. A well studied example covered by this setting is multiple kernel learning, in which the set of kernels is the convex hull of a finite set of basic kernels. I will discuss extensions of this setting to more complex kernel families, which involve additional constraints and a continuous parametrization. Some of these examples are motivated by multi-task learning and structured sparsity, which I will describe in some detail during the talk.
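As a minimal, hedged illustration of the multiple kernel setting (with fixed rather than learned combination weights), the sketch below forms a convex combination of two basic kernels and feeds it to scikit-learn's SVC as a precomputed kernel; the kernel-learning objectives discussed in the talk are not implemented here, and the dataset and weights are invented.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel, linear_kernel

# Two basic kernels combined with fixed convex weights (0.7, 0.3).
# Multiple kernel learning would optimise these weights; here they are just chosen.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)
w = np.array([0.7, 0.3])

def combined_kernel(A, B):
    return w[0] * rbf_kernel(A, B, gamma=0.1) + w[1] * linear_kernel(A, B)

X_train, X_test = X[:150], X[150:]
y_train, y_test = y[:150], y[150:]

clf = SVC(kernel="precomputed")
clf.fit(combined_kernel(X_train, X_train), y_train)
acc = clf.score(combined_kernel(X_test, X_train), y_test)
print("test accuracy with fixed kernel weights:", acc)
```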

 

Analysis, synthesis and applications of gene regulatory network models
Prof. Yaochu Jin (Department of Computing, University of Surrey)
10 Nov 2010, Harrison 107, Wednesday 2pm, Computer Science
This talk starts with a brief introduction to computational models of gene regulatory networks (GRN), followed by a description of our recent results on analyzing and synthesizing gene regulatory motifs, particularly from the robustness and evolvability perspective. We show that in a feedforward Boolean network, the trade-off between robustness and evolvability cannot be resolved. In contrast, we show how this trade-off can be resolved in an ODE-based GRN model for cellular growth based on a quantitative evolvability measure. In addition, we demonstrate that robust GRN motifs can emerge from in silico evolution without an explicit selection pressure on robustness. Our results also suggest that evolvability is evolvable without explicit selection.

 

An Ontology of Information and Information Bearers.
Dr. Antony Galton (Computer Science)
3 Nov 2010, Harrison 170, Wednesday 2pm, Computer Science (Internal)
In many areas, such as emergency management, coordinated action can be hampered by lack of suitable informatic support for integrating diverse types of information, in different formats, from a variety of sources, all of which may be relevant to the problem at hand. To create software that is able to handle such a diversity of information types in a unified framework it is necessary to understand what types of information there are, what forms they can take, and how they are related to each other and to other entities of concern. To this end, I am currently developing a formal ontology of information entities to serve as a reference point for subsequent system development activities. In this talk I will discuss some of the issues that I have had to address in developing the ontology.

 

Novel Machine Learning Methods for Data Integration
Dr. Colin Campbell (Intelligent System Lab, University of Bristol)
27 Oct 2010, Harrison 170, Wednesday 2pm, Computer Science
Substantial quantities of data are being generated within the biomedical sciences and the successful integration of different types of data remains an important challenge. We begin the talk with an overview of our motivation for our investigations in this context, briefly reviewing work on the joint unsupervised modeling of several types of data believed to be functionally linked, such as microRNA and gene expression array data from the same cancer patient. Next we consider supervised learning and outline several approaches to multi-kernel learning which can handle disparate types of input data. We conclude with a discussion of future avenues for investigation in this context.

 

Rubberband Algorithms - A General Strategy for Efficient Solutions of Euclidean Shortest Path Problems
Prof. Reinhard Klette (University of Auckland, New Zealand)
20 Oct 2010, Harrison 170, Wednesday 2pm, Computer Science
Shortest path problems in 2D or 3D Euclidean space are typically either solvable in linear time or of higher-order time complexity, often even NP-hard. Rubberband algorithms follow a general design strategy which is relatively simple to implement, assuming a step set has been identified which contains the vertices of shortest paths and is itself easily calculable. The talk presents solutions to selected shortest path problems using rubberband algorithms.
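To give a concrete, hedged flavour of the strategy (a toy instance only, not the algorithms from the talk), the sketch below finds the shortest path from p to q that visits, in order, one point on each of a sequence of 2D line segments, by repeatedly sliding each vertex along its segment to the locally optimal position; the segments and endpoints are invented for the example.

```python
import numpy as np

# Rubberband-style iteration: each path vertex lives on a prescribed segment
# (a simple "step set") and is repeatedly moved to the position that minimises
# the length of the two adjacent path edges, until the total length converges.

def point_on_segment(seg, t):
    a, b = seg
    return a + t * (b - a)

def best_t(seg, prev_pt, next_pt, iters=60):
    # Ternary search: the local path length is convex in t on the segment.
    lo, hi = 0.0, 1.0
    def length(t):
        x = point_on_segment(seg, t)
        return np.linalg.norm(x - prev_pt) + np.linalg.norm(x - next_pt)
    for _ in range(iters):
        m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
        if length(m1) < length(m2):
            hi = m2
        else:
            lo = m1
    return 0.5 * (lo + hi)

def rubberband(p, q, segments, tol=1e-9, max_iter=1000):
    ts = [0.5] * len(segments)                     # initial vertex positions
    def total_length():
        pts = [p] + [point_on_segment(s, t) for s, t in zip(segments, ts)] + [q]
        return sum(np.linalg.norm(b - a) for a, b in zip(pts, pts[1:]))
    prev = total_length()
    for _ in range(max_iter):
        for i, seg in enumerate(segments):
            prev_pt = p if i == 0 else point_on_segment(segments[i - 1], ts[i - 1])
            next_pt = q if i == len(segments) - 1 else point_on_segment(segments[i + 1], ts[i + 1])
            ts[i] = best_t(seg, prev_pt, next_pt)
        cur = total_length()
        if prev - cur < tol:
            break
        prev = cur
    return cur, ts

p, q = np.array([0.0, 0.0]), np.array([4.0, 0.0])
segments = [(np.array([1.0, -1.0]), np.array([1.0, 1.0])),
            (np.array([2.0, -1.0]), np.array([2.0, 1.0])),
            (np.array([3.0, -1.0]), np.array([3.0, 1.0]))]
length, params = rubberband(p, q, segments)
print("shortest path length:", length)   # expected: 4.0 (the straight line)
```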

 

Kent's talk: A hyper-heuristic approach to generating mutation operators with tailored distributions.

Richard's talk: A Bayesian Framework for Active Learning
Kent McClymont and Richard Fredlund (Computer Science)
13 Oct 2010, Harrison 107, Wednesday 2pm, Computer Science (Internal)
Kent's talk: A discussion of a method for generating new probability distributions tailored to specific problem classes for use in optimisation mutation operators. A range of bespoke operators with varying behaviours are created by evolving multi-modal Gaussian mixture model distributions. These automatically constructed operators are found to match the performance of a single tuned Gaussian distribution when compared using a (1+1) Evolution Strategy. In this study, the generated heuristics are shown to display a range of desirable characteristics for the DTLZ and WFG test problems, such as speed of convergence.
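As a hedged illustration of the baseline machinery in Kent's talk (not the hyper-heuristic itself), the sketch below runs a (1+1) Evolution Strategy on a toy objective with a two-component Gaussian mixture as the mutation operator; in the talk such mixtures are evolved, whereas here the components are simply hand-picked.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy (1+1) Evolution Strategy on the sphere function, with a fixed Gaussian
# mixture mutation operator.  All parameters are illustrative only.

def sphere(x):
    return float(np.sum(x ** 2))

def mixture_mutation(x):
    # Component 1: small local steps; component 2: occasional larger jumps.
    if rng.random() < 0.8:
        return x + rng.normal(0.0, 0.1, size=x.shape)
    return x + rng.normal(0.0, 1.0, size=x.shape)

x = rng.normal(size=5) * 3.0
fx = sphere(x)
for _ in range(2000):
    child = mixture_mutation(x)
    fc = sphere(child)
    if fc <= fx:                 # (1+1) selection: keep the better of parent/child
        x, fx = child, fc

print("best objective value found:", fx)
```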

Richard's talk: We describe a Bayesian framework for active learning for non-separable data, which incorporates a query density to explicitly model how new data is to be sampled. The model makes no assumption of independence between queried data-points; rather it updates model parameters on the basis of both observations and how those observations were sampled. A "hypothetical" look-ahead is employed to evaluate expected cost in the next time-step. We show the efficacy of this algorithm on the probabilistic high-low game which is a non-separable generalisation of the separable high-low game introduced by Seung et al. (1993). Our results indicate that the active Bayes algorithm performs significantly better than passive learning even when the overlap region is wide, covering over 30% of the feature space.

 

Demand Management - Good or Bad?
David Evans (Water Resource Consultant)
29 Jun 2010, Harrison 170, Tuesday 2pm, Informatics RI
The UK’s population is predicted to rise by 20 million in coming decades, while the climate is expected to become wetter in winter and drier in summer. Much emphasis is therefore being put on demand management (or ‘water efficiency’) - but is this the right approach? Our climate is highly seasonal. In summer, evaporation exceeds rainfall and the incoming water resource is negative. But winter resources are large, and summer needs are met by storing winter surplus either naturally in the soil and as groundwater, or artificially in reservoirs. Wetter winters would increase that surplus (though perhaps offset by greater storminess). Groundwaters are fully committed, but water supply for the additional population can come from whatever is the best combination of demand management and surface water development. However, with climate change the predicted combination of less summer rain and higher evaporation spells summer desiccation. It is far from clear if there will be enough water to grow our food or to sustain our green environment. This could be a much bigger issue than water supply and the presentation explores it quantitatively. Fortunately water supply is non-consumptive – it almost all comes back, either as effluent or as leakage, and can be reused, for example for irrigation. Even coastal effluents are water resources available to reclaim when needed. The more the water supply, the more the returns. Conversely, demand management reduces the amount of water that the water companies store in winter and return in summer. The presentation will show how the multiple benefits of water storage greatly exceed those of too much demand management. The future is alarming for our food and green environment. Reservoirs will help; demand management won’t.

 

Flood frequency estimation in urban catchments
Dr Thomas Kjeldsen (Centre for Ecology & Hydrology, Wallingford)
22 Jun 2010, Harrison 170, Tuesday 2pm, Informatics RI
The presentation will introduce the industry-standard for flood frequency estimation in the UK, the Flood Estimation Handbook (FEH), and review ongoing activities in quantifying the impact of urbanisation on catchment flood response and flood frequency relationships. Catchment-scale effects of urbanisation are shown to be complex, and the study will highlight future data requirements needed to improve modelling capabilities in catchments where urban areas interact with the terrestrial water cycle.

 

The use of acoustic methods for sewer network management
Mr Richard Long (Richard Long Associates)
8 Jun 2010, Harrison 170, Tuesday 2pm, Informatics RI
The sewer system in the UK is ageing and is being put under increasing strain by climate change and ongoing development. To replace the systems would cost in excess of £100 billion and would be hugely disruptive, so careful management by the sewerage undertakers is vital. Yet they have only very limited information about the condition of their assets and how they are deteriorating over time. This need has been recognised by government and the companies themselves. Sewerage undertakers in the UK need more information, but existing CCTV technology is slow and it would be unaffordable to survey the whole sewer system on a regular basis this way. Anticipating a demand for new low-cost techniques, a team led by Bradford University has been collaborating to develop acoustic techniques to meet this need. By providing inexpensive, quick, accurate, digital output, SewerBatt has the potential to allow the water industry and other industries with extensive drainage assets to meet the increasingly demanding requirements of government, regulators, customers and shareholders. The presentation will examine the need for new technology and describe the project undertaken. Examples of the output from the system will be provided and how it could be incorporated into sewerage management practice will be discussed.

 

Many Criteria Decision Making in Control and Systems Design
Prof. Peter Fleming (University of Sheffield)
25 May 2010, Harrison 170, Tuesday 2pm, Informatics RI
Design problems arising in control and systems can often be conveniently formulated as multi-criteria decision-making problems. However, these often comprise a relatively large number of criteria. Through close association with designers in industry a range of machine learning tools and associated techniques have been devised to address the special requirements of many-criteria decision-making. These include visualisation and analysis tools to aid the identification of features such as “hot-spots” and non-competing criteria, preference articulation techniques to assist in interrogating the search region of interest and methods to address the special computational demands (for example, convergence and diversity management) of these problems. Test problems and real design exercises will demonstrate these approaches.

 

Info-Gap Theory for Strategic Planning Under Severe Uncertainty: Applications to Pollution Control Policy
Prof. Yakov Ben-Haim (Technion - Israel Institute of Technology)
11 May 2010, Harrison LT04, Tuesday 2pm, Informatics RI
Info-gap theory is a method for analysis, planning, decision and design under uncertainty. The future may differ from the past, so our models may err in ways we cannot know. Our data may lack evidence about surprises: catastrophes or windfalls. Our scientific and technical understanding may be incomplete. These are info-gaps: incomplete understanding of the system being managed. Info-gap theory provides decision-support tools for modelling and managing severe uncertainty. After outlining the info-gap methodology, we explore applications to public policy for pollution control. Given uncertainty in the marginal costs and benefits of pollution emission, is it better to impose a tax on pollution or to establish legal limits to pollution? Given uncertainty in the costs of abatement, how should a public regulator allocate auditing and enforcement resources?
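A minimal numerical sketch of the core info-gap calculation may help readers new to the method: the robustness of a decision is the largest horizon of uncertainty at which its worst-case outcome still meets a critical requirement. The pollution-policy numbers below are invented purely to make the calculation concrete.

```python
import numpy as np

# Toy info-gap robustness calculation.  The nominal marginal benefit of abatement
# is uncertain; the uncertainty model is a fractional-error info-gap around the
# nominal estimate.  Robustness = largest horizon of uncertainty alpha at which
# the worst-case net benefit still meets the critical requirement.

b_nominal = 10.0        # nominal marginal benefit per unit abated (illustrative)
cost = 6.0              # known marginal cost per unit abated
units = 1.0             # abatement level under the policy being assessed
required_net = 2.0      # critical net benefit that must be guaranteed

def worst_case_net(alpha):
    # Worst case within U(alpha) = {b : |b - b_nominal| <= alpha * b_nominal}.
    b_worst = b_nominal * (1.0 - alpha)
    return units * (b_worst - cost)

alphas = np.linspace(0.0, 1.0, 10001)
feasible = [a for a in alphas if worst_case_net(a) >= required_net]
robustness = max(feasible) if feasible else 0.0
print("info-gap robustness of the policy:", round(robustness, 3))  # 0.2 here
```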

 

Industrial Experiences and Applications of Artificial Intelligence (AI) in Real-Time Strategy Computer Games
Ingimar Gudmundsson (Creative Assembly)
7 May 2010, Harrison 170, Friday 1pm, Informatics RI
TBA

 

Decision making under risk: the optimization of storm sewer network design
Sun Si'Ao (University of Exeter)
4 May 2010, Harrison 215, Tuesday 2pm, Informatics RI
It is widely recognised that flood risk needs to be taken into account when designing a storm sewer network. Due to the stochastic character of the flood risk, comparisons between candidate networks are not straightforward. This study aims to explore decision making in flood risk based storm sewer network design. It is viewed as an optimisation problem with the decision criterion determined by a subjective judgement of the decision maker. Several decision criteria are introduced and applied to select an optimal design after a multi-objective optimisation. Different decisions are made according to different criteria as a result of the different concerns represented by the criteria. Moreover, the problem can be formulated as a single-objective optimisation if the decision criterion is provided a priori. Design using a single-objective optimisation is also studied, with the flood risk being evaluated under design storms or via sampling.

 

Hertfordshire Surface Water Management Plan
Mr Nathan Muggeridge (Mouchel)
13 Apr 2010, Harrison 170, Tuesday 2pm, Informatics RI
Defra awarded £9.7 million in October 2009 to allow 77 Councils to develop Surface Water Management Plans (SWMPs). One of these councils was Hertfordshire County Council, and this presentation will provide an insight into the strategy developed for the SWMP, the tasks already completed and what is planned to be delivered. The presentation will cover risk, pluvial modelling, data collection and the communication plan.

 

1D, 2D and 3D modelling of urban flooding
Prof. Slobodan Djordjevic (University of Exeter)
30 Mar 2010, Harrison 170, Tuesday 2pm, Informatics RI
Within the dual drainage framework, flooding can be modelled either as flow in a network of 1D (one-dimensional) open channels and ponds, or as a 2D flow with depth-averaged velocities, or as a 3D computational domain. Each of these approaches has its advantages and drawbacks, thus there is no single “best” choice of the dimensionality of surface flow. In addition, every model – be it 1D, 2D or 3D – involves specific problems related to: spatial resolution and generation of computational mesh, treatment of buildings and terrain features, sub-surface/surface interactions, how the rainfall variability and surface run-off are introduced, possibilities for model calibration and how the uncertainties in model parameters are handled, communication of results etc. Therefore we can only talk about an adequate approach because it very much depends on the extent of the area, quality of available data, purpose and type of the analysis i.e. the required number of off-line or real-time runs and other factors. The talk will address these issues through experiences from several ongoing urban flooding projects and by outlining some unresolved research questions and practical problems.

 

Applying real options and evolutionary optimisation methods to evaluate flood risk intervention strategies
Ms Michelle Woodward (University of Exeter / HR Wallingford)
23 Mar 2010, Harrison 170, Tuesday 2pm, Informatics RI
A framework has been developed to analyse optimum flood risk intervention strategies. Real Options Analysis is recognised as an appropriate technique for valuing flexibility in investment decisions and is now promoted by the Treasury as being appropriate for flood and coastal erosion risk management. In particular, the ability of Real Options Analysis to assist in developing climate change adaptation strategies is well recognised. Moreover, the application of Real Options combined with evolutionary optimisation methods, and in particular multiobjective optimisation, provides the ability to generate long term flood risk intervention strategies. A computational framework will be described which assesses the most appropriate set of interventions to make in a flood system and the opportune time to make these interventions, given the future uncertainties. This framework captures the concepts of real options and employs an optimised decision framework to evaluate potential flood risk management opportunities across a range of future climate change and socio-economic scenarios.

 

Rainfall-Runoff Simulation and Groundwater Recharge in Arid Regions
Prof. Mohsen Sherif (College of Engineering, UAE University)
17 Mar 2010, Harrison 203, Wednesday 3pm, Informatics RI
In arid and semi-arid regions, rainfall events are limited and random. Extreme events are more frequent and, hence, detention and retention dams are usually built across the main wadis to intercept surface water runoff and recharge the groundwater systems. This presentation focuses on the simulation of surface water runoff and groundwater recharge due to water storage in the lakes of dams. The HEC-HMS model is used to simulate the surface water runoff and water storage in the lakes of three main dams due to rainfall events in the northern area of the United Arab Emirates. Within the calibration process of HEC-HMS, the simulated water flow and storage in the dams were compared with the observed data for several storm events. Using the calibrated model, a family of rainfall-runoff/storage curves was developed based on the duration and intensity of rainfall events. These curves can be used for prediction of surface water runoff in the three wadis and water storage in the dams in response to different rainfall events. The groundwater recharge was simulated using MODFLOW. The model was calibrated and verified using different data sets and the results of groundwater levels were found to be in good agreement with the observed data. The model was also used to assess the increase of groundwater recharge due to the construction of dams. Significant amounts of the infiltrated water are retained in the unsaturated zone.

 

A Simulation-Optimisation Model to Study the Control of Saltwater Intrusion into Coastal aquifers
Mr Hany F. Abd-Elhamid (University of Exeter)
16 Mar 2010, Harrison 170, Tuesday 2pm, Informatics RI
Seawater intrusion is one of the most serious environmental problems in many coastal regions all over the world. It is one of the processes that degrade water quality by raising salinity to levels exceeding acceptable drinking water standards. Mixing a small quantity of seawater with groundwater makes it unsuitable for use and can result in abandonment of aquifers. Therefore, seawater intrusion should be prevented or at least controlled to protect groundwater resources. This work presents the development and application of a simulation-optimization model to control seawater intrusion in coastal aquifers using different management models. The model is based on the integration of a genetic algorithm optimization technique and a coupled transient density-dependent finite element model, which has been developed for simulation of seawater intrusion. The management scenarios considered include abstraction of brackish water, recharge of fresh water and a combination of abstraction and recharge. The objectives of these management scenarios include minimizing the total costs for construction and operation, minimizing salt concentrations in the aquifer and determining the optimal depth, location and abstraction/recharge rates for the wells. Also, a new methodology is presented to control seawater intrusion in coastal aquifers. In the proposed methodology, ADR (abstraction, desalination and recharge), seawater intrusion is controlled by abstracting brackish water, desalinating it using a small scale reverse osmosis plant and recharging it to the aquifer. The developed model is applied to a number of case studies. The efficiencies of the three different scenarios are examined and compared. The results show that all three scenarios could be effective in controlling seawater intrusion, but the ADR methodology results in the lowest cost and salt concentration in aquifers and the maximum movement of the transition zone towards the sea. The application of the ADR methodology appears to be more efficient and more practical, since it is a cost-effective method to control seawater intrusion in coastal aquifers. The developed model is an effective tool to control seawater intrusion in coastal aquifers and can be applied in areas where there is a risk of seawater intrusion. Finally, the developed simulation model is applied to study the effects of likely climate change and sea level rise on saltwater intrusion in coastal aquifers.

 

Community Resilience to Extreme Weather (CREW)
Dr Albert Chen (University of Exeter)
9 Mar 2010, Harrison 170, Tuesday 2pm, Informatics RI
'Community Resilience to Extreme Weather' (CREW) is an EPSRC-funded research project, established to develop a set of tools for improving the capacity for resilience of local communities to the impacts of future extreme weather events. CREW focuses on understanding the probability of current and future extreme weather events and their likely socio-economic impacts. Initiatives, such as the Stern Review, provide high-level socio-economic impacts but do not provide the sub-regional or local estimates pertinent at the community and individual scale. Therefore, the CREW consortium is investigating impacts at the local level (on householders, SMEs and local policy/decision makers). The research is also investigating the opportunities and limitations for local communities' adaptive capacity. CREW, using five South East London boroughs as case studies, is considering the decision making processes across communities including impediments and drivers of change. A web-based portal will provide a facility for presenting probable extreme weather events for a range of scenarios, and for presenting and evaluating coping mechanisms. Dr Slobodan Djordjevic and Dr Albert Chen at the CWS are involved with the programme package SWERVE (Severe Weather Events Risk and Vulnerability Estimators) of CREW for urban pluvial flood modelling. Albert Chen will introduce the CREW project and explain the details of the modelling that accounts for the impact of future climate scenarios.

 

Neptune: Risk-Based Decision Support for Water Distribution System Incident Management
Mr Josef Bicik and Dr Mark Morley (University of Exeter)
2 Mar 2010, Harrison 170, Tuesday 2pm, Informatics RI
Project Neptune aims to develop an integrated, risk-based Decision Support System (DSS) to facilitate tactical (near real-time) and strategic decision making. This system should inform network operators and permit the rapid investigation, evaluation and rectification of network failure events. In so doing, the Neptune DSS seeks to reduce the impact on customers, assist water companies in meeting regulatory requirements and minimize the environmental and economic impact of incidents. This talk describes some of the work undertaken within the Centre for Water Systems on developing the risk-based methodologies and software components that make up the Neptune DSS. It concludes with a live demonstration of the Decision Support System on a number of case studies.

 

Optimisation on emergency scheduling of the raw water supply system in Zhuhai
Qi Wang (Tsinghua University, China)
23 Feb 2010, Harrison 170, Tuesday 2pm, Informatics RI
For a water supply system near a river estuary, the length of time during which river water can be abstracted is affected by the saline tide period. Therefore, in order to increase the security and reliability of the water supply system, it is necessary to develop reasonable scheduling, which aims to ensure that the reservoir volumes within the system meet the water demand during the saline intrusion period. In this research, a mathematical model of the multi-source water supply system near the estuary was established. A genetic algorithm was used to calculate the water level control lines of the reservoirs, with both the power consumption and system security considered in the objective function. Comparison with historical operating data shows that an optimal hydrograph obtained using the proposed method can significantly improve the security of the water supply system and reduce the operational energy cost.

 

The confidence to build - Some thoughts on engineering software
Prof. Bill Harvey
26 Jan 2010, Harrison 170, Tuesday 2pm, Informatics RI
Engineering software began about 1950 with some of the earliest available digital computers. It became possible to analyse complex structures. Since then, the power of computers has grown at an almost frightening rate. Endless bells and whistles have been added to analytical software. In recent years there have been attempts to couple CAD programs to analysis and call the package design. But engineering design is not about deciding on a geometry then asking the computer where the forces go. Real design is about deciding where you want the forces to flow and then arranging the geometry to make that happen. The same thing is true in CFD but that is not where I work. It is long past time that we realised and released the power of computers to help with design instead of merely providing analytical results. That will involve fundamental changes in our vision of what we want to do. Bill's work is chiefly in structural assessment and in this field, surprisingly, the problem is greater. When designing a new building, if the analysis says this bit is too weak, a stroke of a pen (or mouse) is enough to make it stronger. If the structure is already there we often need to know where the forces might really go rather than where they could go if they need to. The seminar will cover some of the issues described above in the light of the sort of exploratory analysis Bill uses for his assessment work.

 

Integrated Water Management – Experience from Australia and GHD’s Innovative IWM Toolkit
Mr Mike Jones (Head of Water, UK, GHD)
19 Jan 2010, Harrison 170, Tuesday 2pm, Informatics RI
Integrated Water Management (IWM) is a strategy that draws together all facets of the water cycle including water supply, sewage and stormwater management to achieve beneficial social, environmental and economic outcomes. While the application of IWM principles to the UK water industry is in its infancy, a wide range of IWM strategies have been used successfully in communities throughout Australia. We will present an overview of IWM in Australia. A number of case studies will be discussed that demonstrate the range of IWM strategies being developed in Australia along with some lessons learnt from the Australian experience. GHD’s IWM Toolkit will also be presented. The IWM Toolkit is an innovative water balance modelling software tool that simulates complex integrated water servicing scenarios to assist water system planners in identifying the preferred water servicing strategy from a range of options.

 

Coupled water environmental model and system dynamics (SYDWEM) of integrated population-economy-water-river system in a rapidly urbanising catchment
Dr Hua-Peng Qin (University of Exeter)
8 Dec 2009, Harrison 170, Tuesday 2pm, Informatics RI
Rapidly urbanizing catchments in developing countries usually face water quantity shortages and quality deterioration because water infrastructure development lags behind the demand of rapid population and economic growth. Although existing environmental models can individually describe socioeconomic, water infrastructure and natural water systems, they cannot effectively capture the interactions among them. Taking the rapidly urbanizing Shenzhen river catchment in China as an example, we developed a system dynamics based approach (SYDWEM) to couple water environmental models of the population-economy, water infrastructure and river system in the catchment. The approach was verified to have the ability to simulate the relationships between social, economic, water resource and effluent discharge issues as well as pollutant behaviour in the river. The approach was further applied to predict GDP and population growth, water balance and water quality variation at the catchment scale under proposed socio-economic policies (e.g. industrial structure regulation, water conservation) and engineering measures (e.g. wastewater treatment and reuse). By comparing the effects and sensitivities of the proposed policies and measures, an integrated management strategy was proposed to harmonize socio-economic and eco-environmental development in the catchment. The results indicated that SYDWEM provides a flexible decision making tool for integrated water management in urbanizing catchments.

 

Towards the sustainable city? Principles and practice
Prof. David Butler (University of Exeter)
1 Dec 2009, Harrison 170, Tuesday 2pm, Informatics RI
The presentation asks the questions 'what is a sustainable city' and 'would we know it when we saw it'? The aim of the presentation is not to headline gold standard exemplars, but to evaluate the reality of what sustainability means on the ground for large-scale housing projects. It describes and discusses the attempts of the government of England and Wales to take the principles embedded at policy level and begin to roll out new ‘sustainable’ developments in practice, with specific reference to water management. It concludes with another question: 'Is the journey towards sustainability more important than the goal?'

 

Talking Heads: Creating Realistic 2D & 3D Facial Animations
Dr Paul Rosin (Cardiff University)
24 Nov 2009, Harrison 170, Tuesday 2pm, Informatics RI
This talk will describe ongoing work at Cardiff University for building photo-realistic models of faces using active appearance models (AAMs) applied to both 2D image data and textured 3D mesh data. We have applied these models in a variety of contexts: 1/ speech driven animation, in which a combined audio and image model is built and new unseen audio is used to synthesise an appropriate image sequence, 2/ performance-driven animation, in which the animation parameters analysed from a video performance of one person are used to animate the facial model of another person, 3/ production of stimuli for psychological experiments to determine the human perception and judgement of the facial dynamics of smiles, and 4/ biometrics, in which people are identified based on their facial dynamics captured during an utterance.

 

Environmental implications of water efficiency measures in buildings
Abdi Fidar (University of Exeter)
10 Nov 2009, Harrison 170, Tuesday 2pm, Informatics RI
To encourage the efficient and sustainable use of resources (e.g. energy, water, construction materials) in England and Northern Ireland, the government has introduced the ‘Code for Sustainable Homes’. The code requires a reduction in per capita water consumption in households, and accordingly provides a rating scheme. Water efficiency related targets can be met using a range of water efficient microcomponents. However, very little is known about their environmental and energy related implications. This paper describes the development of a strategy to quantify the water and energy use of microcomponents and evaluate the environmental performance of composite strategies (comprising different combinations of water efficient microcomponents). A multi-objective optimisation based simulation tool has been developed to investigate the impacts of the composite strategies. Preliminary findings indicate that, depending on the objective (i.e. whether to reduce water consumption or greenhouse gas emissions), a trade-off is required. For example, the analysis of results suggests that (a) energy use and the associated greenhouse gas emissions are largely dependent on how a given volume of water is contributed by the different microcomponents, (b) the influence of dishwashers on water consumption reduces significantly when water efficient kitchen taps are present, (c) baths do not necessarily use more water than showers, and (d) in most cases, for a given water consumption, the energy use and the greenhouse gas emissions are inversely proportional to the toilet flush volume.

 

Unsaturated Soil Mechanics: Expansive Soils & Soil Treatment
Prof Farimah Masrouri and Dr Olivier Cuisinier (LAEGO-ENSG-INPL, France)
5 Nov 2009, Harrison 170, Thursday 2pm, Informatics RI
Part I: Expansive Soils. Shrinkage and swelling of clayey soils are responsible for a large amount of building damage and give rise to a number of questions. This presentation is focused on three different aspects of a research project carried out to contribute to the comprehension of the physical mechanisms of this natural hazard and its consequences on lightly-loaded structures. Part II: Stabilisation with Lime. Quicklime addition is a common technique to improve the physical properties of fine soils and is also known to significantly reduce the swelling ability of expansive soils. One of the main concerns with this practice is the permanence of the stabilisation effects brought by the lime addition. This part of the presentation will be focused on a research project that intends to assess the impact of water circulation on the long term behaviour of lime-stabilised soils.

 

Local Water Symbiosis Approach to More Sustainable Urban Water Management
Dr Sara Moslemi Zadeh (University of Birmingham)
27 Oct 2009, Harrison 170, Tuesday 2pm, Informatics RI
In the 21st century, stress on water resources is reaching critical levels due to population growth, rapid urbanization, economic development, climate change, and an ageing infrastructure. An Integrated Water Resources Management approach is urgently needed to secure the equitable and more sustainable management of freshwater to meet environmental, economic and social needs. Greywater treatment and its subsequent use for toilet flushing have been explored as more sustainable water resources management options. However, the infrastructure needs and the disinfectant required for greywater systems make it very difficult to see these systems as environmentally friendly and cost effective (i.e. sustainable), especially for individual households. This is the reason for the low uptake of greywater recycling systems, particularly in the UK. The aim of this project is to test the hypothesis that transferring the concept of industrial symbiosis from industries to urban areas may improve the sustainability of urban water management. The research focuses on greywater reuse among users in residential and office buildings in a local area. The local symbiosis is designed in three stages: first, calculating a potential water balance through water reuse and recycling options, including greywater generation; second, identifying or estimating sharing potentials in the area by considering the qualities of the water available for reuse; and finally, calculating the optimal scale and mix of users as well as identifying any barriers to implementation. By minimizing the potable water usage, the environmental effect and the cost of the system are reduced.

 

Diagnostic Assessment of Search Controls and Failure Modes in Many-objective Evolutionary Optimisation
Prof. Patrick Reed (Penn State University, USA)
20 Oct 2009, Harrison 170, Tuesday 2pm, Informatics RI
The growing popularity of multiobjective evolutionary algorithms (MOEAs) for solving many-objective real world problems, where search failures can have actual economic costs, warrants the careful investigation of their search controls and failure modes. This study contributes a detailed statistical assessment of the search controls and failure modes for a broad range of MOEAs as well as a novel measure of controllability. The comparative analysis applies ten state-of-the-art MOEAs on the DTLZ scalable test problems for 2-7 objectives. From these results, we quantitatively compare the effectiveness and controllability of the algorithms. The study concludes by providing guidance on the top performing algorithms' non-separable, multiparameter controls when performing many-objective search.
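For readers unfamiliar with the many-objective setting, the sketch below shows only the basic Pareto-dominance building block used when comparing solution sets such as those produced by MOEAs on the DTLZ problems; it is not the diagnostic framework from the talk, and the objective vectors are invented.

```python
import numpy as np

# Pareto dominance and non-dominated filtering (minimisation).

def dominates(a, b):
    # a dominates b if it is no worse in every objective and better in at least one.
    return np.all(a <= b) and np.any(a < b)

def non_dominated(points):
    points = np.asarray(points, dtype=float)
    keep = []
    for i, p in enumerate(points):
        if not any(dominates(points[j], p) for j in range(len(points)) if j != i):
            keep.append(i)
    return points[keep]

objs = np.array([[1.0, 5.0, 2.0],
                 [2.0, 4.0, 2.0],
                 [1.5, 5.0, 3.0],   # dominated by the first point
                 [3.0, 1.0, 1.0]])
print(non_dominated(objs))
```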

 

A Framework for Supporting Rainwater Harvesting in the UK: addressing system design and implementation deficits
Sarah Ward (University of Exeter)
13 Oct 2009, Harrison 170, Tuesday 2pm, Informatics RI
A broad range of recent policy vehicles, such as the Code for Sustainable Homes, Future Water and the Draft Flood and Water Management Bill, are placing increased emphasis on the incorporation of sustainable urban drainage systems (SUDS) within new developments (and, to a lesser extent at present, retrofitting to existing developments). This is in response to both potable water demand reduction and surface runoff source control drivers. Rainwater harvesting (RWH) is one such SUDS technique that can both supplement mains water and attenuate stormwater flows. Putting aside debates on the cost-benefit and life-cycle analysis of such systems, the UK is significantly behind other countries, such as Germany and Australia, in implementing RWH. Although the previously mentioned policies promote the use of RWH, there is at present limited enabling of stakeholders (householders, businesses, schools) to implement RWH. This presentation outlines research that has utilised an interdisciplinary approach, undertaking both ‘hard’ and ‘soft’ scientific methods, to establish technical and stakeholder evidence bases related to RWH. Analysis of these evidence bases has resulted in the identification of a number of technical and social ‘implementation deficits’, which hinder the increased uptake of RWH in the UK. Key messages from the evidence bases are derived, resulting in recommendations for restructuring current support mechanisms, having implications for both policy makers and technical innovators.

 

New type of soil reinforcements
Prof. Meng Xi (Shanghai University)
1 Oct 2009, Harrison 170, Tuesday 2pm, Informatics RI
TBA

 

"Why the Beauty of your Neighbour's Garden may Belong to You" or "How should we Pay for Urban Water"
Jim Robinson (University of Waterloo, Canada)
6 Aug 2009, Harrison 170, Thursday 2pm, Informatics RI
Many water utilities experience peak summer demands which are very expensive to satisfy; providing for them can only be achieved at a cost per litre many times what is currently being charged for water. Some utilities are quite interested in using tariff structures, partly because they improve equity by better reflecting the real costs of providing for peak demands. Some modelling has shown that 65-85% of customers would end up with lower water bills if summer use rates were implemented. However, utility staff feel such rate structures are a lot of work to implement just to improve equity, and would be much more interested if there were strong evidence that peak demands will in fact be lowered and that utility capital spending could thus be averted. Many utilities would like evidence about the effects of peak demand rates to be gathered but would prefer someone else do it. Part of this is nervousness over the public relations issues associated with rate changes. A few brave water companies in the UK have started to collect that evidence, some using innovative experimental techniques. In one 5000-household trial in the UK, a brand new customer or a change of name on an account leads to installation of an automated meter and data logger collecting meter readings every 30 minutes, and assignment to one of five charging regimes, all of which are revenue neutral to an average customer. Other utilities have suggested compensating trial participants up front to be more certain that no customer is disadvantaged. The foci of this presentation are to show ways that peak demand pricing trials have been and can be implemented, and to discuss some tentative results and their implications.

 

Reinforcement of weak soils: Are we utilising the reinforcement to get the best outcomes?
Dr Mostafa Mohamed (University of Bradford)
2 Jun 2009, Harrison 170, Tuesday 2pm, Informatics RI
In recent years, there has been growing usage of geosynthetic materials for the fulfilment of a wide range of functions, including improvement of the shear strength of weak soils, in order for civil engineering projects to be constructed without the fear of long-term serviceability problems. Reinforcement of surface soils is also used to overcome potential negative consequences for construction in areas that feature pockets of soft soils. In general, the behaviour of the reinforced ground relies on the frictional resistance between the reinforcement and the surrounding soils. In most cases there has been overuse of reinforcing layers, which never utilises the tensile strength of the reinforcing elements to the utmost. Furthermore, to speed up the construction process, reinforcement is often arranged in a simple manner. The presentation will review the current practice for soil reinforcement, highlighting various failure mechanisms. It will then focus on presenting a new approach to reinforcing surface soils in which a nominal amount of soil is wrapped around with a layer of geotextile in order to create a wrapped cushion underneath the surface footing. The proposed reinforcement approach shows that significant improvements could be achieved. The presentation will show that the new reinforcement arrangement has the potential to improve the bearing capacity of a surface footing several times over and to reduce the associated settlement. The improvements achieved by the proposed approach are greater than those obtained by conventional soil reinforcement. Cases in which pockets of soft soils exist will be discussed. Finally, an outline will be given of recent geotechnical engineering research undertaken at the University of Bradford.

 

Metals at the tap – how did we get it so wrong and what next?
Dr Colin Hayes (School of Engineering, Swansea University)
5 May 2009, Harrison 170, Tuesday 2pm, Informatics RI
The metals at the tap of greatest concern are lead, copper and nickel, all of which have health concerns (particularly lead). These metals mainly arise from pipes and fittings in the domestic water supply system. Despite apparent EU requirements to mitigate such issues, sampling problems have conspired to substantially diminish corrective action, the big exception being the UK, where 95% of public water supplies are dosed with ortho-phosphate to suppress plumbosolvency. As sampling deficiencies are rectified, the true scale of problems will emerge. An initial assessment is that 25% of the EU population could be at risk from lead in drinking water. Copper problems appear to be insignificant, whereas nickel problems may also be significant, subject to any future changes in the health based standard. International research networking has potentially influenced the revision of the EU Drinking Water Directive and the implementation of the WHO/UN Protocol on Water and Health. Metals at the tap appear set to climb the international policy agenda. A zonal lead emission model has been developed at Swansea University that can predict compliance with lead standards across an entire city or town. This is based on a Monte Carlo probabilistic framework and has been validated successfully in numerous case studies. It has been used to optimise phosphate dosing and is currently being used to investigate health risks.
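To illustrate the Monte Carlo idea behind zonal compliance estimation (a sketch only, not the Swansea model), the snippet below draws household lead-at-tap concentrations from an assumed lognormal distribution and reports the fraction below a standard; the distribution parameters and scenarios are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Minimal Monte Carlo sketch: simulate household lead concentrations and
# estimate zonal compliance as the fraction below the standard.

standard_ug_per_l = 10.0            # lead standard, micrograms per litre
n_households = 100_000

# Assumed plumbosolvency scenarios: without and with ortho-phosphate dosing.
without_dosing = rng.lognormal(mean=1.5, sigma=1.0, size=n_households)
with_dosing = rng.lognormal(mean=0.5, sigma=0.9, size=n_households)

for label, sample in [("no dosing", without_dosing), ("P dosing", with_dosing)]:
    compliance = np.mean(sample <= standard_ug_per_l)
    print(f"{label}: estimated zonal compliance = {compliance:.1%}")
```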

 

A Semantic Theory of the Interpretation of a Vague Language
Dr Brandon Bennett (University of Leeds)
26 Mar 2009, Harrison 170, Tuesday 3pm, Informatics RI
I present a semantics for the interpretation of a language that includes vague predicates, based on a refinement and extension of the "supervaluation" approach. The proposed theory provides a formal characterisation of the space of possible precise interpretations (precisifications) of the language, in terms of parameters that specify the applicability of vague concepts by means of thresholds on the values of observable measurements. These observables also determine a set of possible states of the world. Thus the truth of a proposition depends on both the possible world and the precisification with respect to which it is evaluated. On the basis of this semantics, the acceptability of a proposition to an agent is characterised in terms of the agent's beliefs about the world and attitude to admissible interpretations of vague predicates. The theory is applied to analysing certain aspects of the cognitive evaluation of vague information --- in particular the sorites paradox. A further extension, in which probability spaces are defined over the sets of possible worlds and precisifications, is used to give a statistical measure of the acceptability of a proposition to an agent.
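A toy sketch of the supervaluation idea (only the generic notion, not Dr Bennett's formal theory) may help: a vague predicate is made precise by a threshold parameter, and a proposition is supertrue if it holds under every admissible precisification, superfalse if it holds under none, and indeterminate otherwise. The predicate, thresholds and heights below are invented.

```python
import numpy as np

# Toy supervaluation over a vague predicate "tall".
admissible_thresholds_cm = np.linspace(175.0, 190.0, 151)   # assumed precisifications

def tall(height_cm, threshold):
    return height_cm >= threshold

def supervaluate(height_cm):
    verdicts = [tall(height_cm, t) for t in admissible_thresholds_cm]
    if all(verdicts):
        return "supertrue"
    if not any(verdicts):
        return "superfalse"
    # The fraction of precisifications on which the proposition holds gives a
    # graded acceptability measure, in the spirit of the probabilistic extension.
    return f"indeterminate (acceptability {np.mean(verdicts):.2f})"

for h in (170.0, 182.0, 195.0):
    print(h, "->", supervaluate(h))
```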

 

Systematic decision analysis for flood risk management
Hamish Harvey (Newcastle University)
17 Mar 2009, Harrison 170, Tuesday 3pm, Informatics RI
Continuing rapid growth in the availability of data and the plummeting cost of computation represent both a great opportunity and a challenge. How can this abundance be utilised to improve the quality of decision making in water management? This opportunity has been grasped in the use of risk analysis in strategic decision making for flood risk management. Risk analysis, however, represents only the first step in a process of development, the logical conclusion of which is systematic, quantitative decision analysis. Decision analysis techniques allow the merits of sets of management options to be examined, according to a variety of performance metrics and under a range of possible future conditions. I will outline a conceptual framework for such analyses and discuss the initial design of a software tool that implements it. Our focus is strategic decision making, in which option performance must be assessed over long time horizons. This introduces the need to represent processes of change, for example in demographics or climate, which may be highly uncertain. Early applications of risk analysis have demonstrated some of the opportunity of computational experiments in informing decisions. They have also exposed some aspects of the challenge: the results of these analyses are often opaque and difficult to trust. Properly managed and with the use of appropriate tools, however, the process of setting up and running a decision analysis can lead not only to greater insight but also to improved transparency. Our work sets out to bridge this gap between present reality and future potential.

 

Multi-layered approach to generalising Digital Terrain Models for 2D Flood Modelling
Mr Barry Evans (University of Exeter)
10 Mar 2009, Harrison 170, Tuesday 3pm, Informatics RI
In surface flow path flood modelling, 2D models are the preferred choice as they can simulate surface flow more accurately than 1D models; they are, however, more computationally demanding and thereby require greater simulation time. One approach that can be used to reduce the computational time required in 2D modelling is to utilise a coarser resolution of data by down-sampling the original data-set. Yu and Lane (2006a) showed, however, that even relatively small changes in model resolution have considerable effects on the predicted inundation extent and the timing of flooding. One of the key problems with coarse resolutions in an urban environment is the spreading of buildings into their surroundings and the subsequent loss of key details such as alleyways. This seminar will outline the potential of utilising a multi-layered approach for the representation of buildings within a generalised terrain model and the automated processes involved in converting Digital Terrain Model data into a multiple layer generalised grid.
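To make the coarsening step concrete, the sketch below down-samples a synthetic Digital Terrain Model by block averaging (the basic generalisation referred to above); the separate handling of buildings as an additional layer, which is the subject of the talk, is not implemented, and the grid sizes are invented.

```python
import numpy as np

# Minimal sketch of generalising a DTM to a coarser resolution by block averaging.

def downsample_dtm(dtm, factor):
    rows, cols = dtm.shape
    rows -= rows % factor            # trim so the grid divides evenly
    cols -= cols % factor
    blocks = dtm[:rows, :cols].reshape(rows // factor, factor, cols // factor, factor)
    return blocks.mean(axis=(1, 3))

rng = np.random.default_rng(4)
fine = rng.random((100, 100)) * 10.0   # synthetic 1 m DTM, elevations in metres
coarse = downsample_dtm(fine, 5)       # generalised to 5 m cells
print(fine.shape, "->", coarse.shape)
```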

 

Adapting the strategic management of water systems to climate (and other) change - the need for new methods for analysing uncertainty and making robust risk-based decisions
Peter von Lany (Principal Consultant, Halcrow Group Ltd)
3 Mar 2009, Harrison 170, Tuesday 3pm, Informatics RI
Much attention is focused on mitigating factors that contribute to climate change. Mitigation needs to be integrated with actions to adapt to climate (and other) change in ways which do not exacerbate the causes or effects of climate change nor limit the ability of other sectors to adapt. Decision making processes such as the UKCIP Adaptation Wizard (see www.ukcip.org.uk) signal the need for risk-based decision making techniques and methods for analysing uncertainty to help:
o determine aspects of water system(s) that are vulnerable to the effects of climate (and other) change
o assess the potential risks to the system(s) and the opportunities arising from climate (and other) changes, and their implied costs and benefits
o develop adaptive responses to these risks and opportunities to maintain or enhance in a sustainable manner aspects of the system that are most valued.
The seminar will outline some recent work that the presenter has been involved in, in the strategic planning of flood risk management and water resources. It will consider the need for new techniques for risk-based decision making and dealing with uncertainty, and identify, for discussion, some of the challenges faced and possible ways forward to dealing with these issues.

 

A Hybrid Generative/Discriminative Framework to Train a Semantic Parser from an Un-annotated Corpus
Dr Yulan He (University of Exeter)
24 Feb 2009Harrison 170 Tuesday 3pmInformatics RI
We propose a hybrid generative/discriminative framework for semantic parsing which combines the hidden vector state (HVS) model and hidden Markov support vector machines (HM-SVMs). The HVS model is an extension of the basic discrete Markov model in which context is encoded as a stack-oriented state vector. The HM-SVMs combine the advantages of hidden Markov models and support vector machines. By employing a modified K-means clustering method, a small set of the most representative sentences can be automatically selected from an un-annotated corpus. These sentences, together with their abstract annotations, are used to train an HVS model which can subsequently be applied to the whole corpus to generate semantic parsing results. A filtering method is then used to select the most confident semantic parsing results to generate a fully-annotated corpus which is used to train the HM-SVMs. The proposed framework has been tested on the DARPA Communicator Data. Experimental results show an improvement over the baseline HVS parser when using the hybrid framework. When compared with the HM-SVMs trained from the fully-annotated corpus, the hybrid framework gave comparable performance with only a small set of lightly annotated sentences.
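A minimal sketch of the general idea of picking representative sentences by clustering is given below. It uses scikit-learn's standard KMeans rather than the modified K-means described in the talk, and TF-IDF features as a stand-in for whatever sentence representation the authors actually use; the HVS and HM-SVM training steps are assumed to live elsewhere.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

def select_representatives(sentences, k=5):
    """Cluster sentences and return the sentence closest to each cluster centroid."""
    X = TfidfVectorizer().fit_transform(sentences)
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)
    reps = []
    for c in range(k):
        idx = np.where(km.labels_ == c)[0]
        dists = np.linalg.norm(X[idx].toarray() - km.cluster_centers_[c], axis=1)
        reps.append(sentences[idx[np.argmin(dists)]])
    return reps
```

The selected sentences would then be hand-annotated and used to bootstrap the generative parser, in the spirit of the pipeline the abstract describes.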

 

Statistical challenges within the water industry
Dr Tim Watson (MWH - Business Solutions)
17 Feb 2009Harrison 170 Tuesday 3pmInformatics RI
The use of statistics is becoming more important within the water industry as a means of developing models that aim to improve understanding of the present performance of assets as well as to forecast their future performance. These models, and related forecasts, are typically used to determine short and long-term investment requirements by use of some form of constrained optimization. The accurate estimation of the deterioration and performance of assets over time should be seen as a critical step in the efficient and sustainable allocation of investment spend and ongoing profitability. The modelling process, and the models themselves, therefore need to contain a large helping of statistical rigor and be fit for purpose, to ensure investment plans are robust and as accurate as the data supports. In this presentation, we will present some of the challenges that the water industry faces in achieving the above within the constraints of ‘real world’ problems. These challenges are diverse in nature and relate to three key areas: resources, data, and statistical rigor.

 

System Dynamics Modelling for the simulation of complex hydrological/ water management systems
Dr Lydia Vamvakeridou-Lyroudia (University of Exeter)
10 Feb 2009Harrison 170 Tuesday 3pmInformatics RI
System Dynamics Modelling is a methodology for studying and managing complex feedback systems, typically used when formal analytical models do not exist but a system simulation can be developed by linking a number of feedback mechanisms. Such models are particularly useful for building, understanding and presenting models to non-engineers. This seminar presents the procedure for developing conceptual and System Dynamics models in a participatory, interdisciplinary context, for a complex hydrological/water management system, the Merguellil valley catchment in Tunisia. The model is being used to study various water management scenarios for 35 small dams and one large dam in the Merguellil valley, and their impact on aquifer recharge, under different climatic conditions.

 

A hidden semi-Markov model for recurrent events
Theo Economou (University of Exeter)
3 Feb 2009Harrison 170 Tuesday 3pmInformatics RI
There is a large and rich literature on the statistical modelling of recurrent events, found mainly in the survival and reliability context. Typically this is done by assuming a counting process such as the non-homogeneous Poisson process (NHPP) with a time-dependent intensity function (occurrence rate). Sometimes it may be appropriate to use a hidden Markov process formulation to account for the fact that in longitudinal data, such as recurrent events data, the subjects in the study undergo changes in time which could possibly depend on the events themselves. In addition, it is more appropriate to assume a hidden semi-Markov rather than a Markov process, since the latter assumes that the times spent in each state are geometrically distributed, which we believe is an unrealistic assumption. The semi-Markov process allows the incorporation of a temporal structure, thus increasing flexibility. The hidden semi-Markov model is implemented using MCMC on some illustrative data sets including river floods and water pipe bursts.
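For readers unfamiliar with the NHPP baseline that the talk contrasts with the hidden semi-Markov formulation, the standard thinning algorithm for simulating an NHPP with a time-dependent intensity is sketched below; the power-law intensity is purely illustrative and is not taken from the talk.

```python
import numpy as np

def simulate_nhpp(intensity, t_max, lam_max, rng=None):
    """Simulate event times of an NHPP on [0, t_max] by thinning (Lewis-Shedler)."""
    rng = rng if rng is not None else np.random.default_rng(0)
    times, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / lam_max)          # candidate from a homogeneous process
        if t > t_max:
            break
        if rng.random() < intensity(t) / lam_max:    # accept with probability lambda(t)/lam_max
            times.append(t)
    return np.array(times)

# Illustrative increasing (Weibull-type) intensity, as might describe an ageing asset.
alpha, beta = 0.2, 1.5
events = simulate_nhpp(lambda t: alpha * beta * t ** (beta - 1),
                       t_max=50.0,
                       lam_max=alpha * beta * 50.0 ** (beta - 1))
```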

 

Water Resources Management: The Myth, The Wicked, & The Future
Prof. Patrick M. Reed (The Pennsylvania State University, USA)
27 Jan 2009Harrison 170 Tuesday 2pmInformatics RI
There is a growing recognition that we must better account for a broader range of process couplings, as well as their associated uncertainties, to manage the evolving human-natural systems that shape our water resources. These challenges are confounded by, and potentially conflict with, increasing expectations that engineers facilitate public participation in infrastructure design decisions. Water resources management has long considered these types of challenges. Many valuable historical perspectives have been somewhat lost in the recent water resources management literature. In this talk, I will argue that we are neglecting two key historical lessons as engineers: (1) least-cost optimality in complex human-natural systems is a myth and (2) water resources management is reflective of a class of wicked social value problems where uncertainties, diverse perspectives, and couplings require collaborative formulation and evaluation of a broad range of design aspirations. Future water resources systems management paradigms must acknowledge wealth-risk tradeoffs, their associated uncertainties, and the potential consequences of limitations in our knowledge of the “chains of causality” operating in complex water systems (will A cause B?). Simultaneous consideration of these factors motivates the need for new frameworks for “constructive” decision-aiding. Decision-aiding can be viewed as a process of collaborative learning and negotiation that can exploit many-objective analysis to identify alternatives that capture a broad suite of system behaviors relevant to both modeled and unmodeled objectives, helping decision makers to discover system dependencies and/or tradeoffs and exploit this information in the negotiated management of complex water resources systems. An example of constructive many-objective decision aiding will be provided for an urban water supply portfolio planning case study for the Lower Rio Grande Valley located in Texas, USA.

 

Using Interaction information for clustering variables in causal Bayesian networks
Michael Barclay (University of Exeter)
13 Jan 2009Harrison 170 Tuesday 2pmInformatics RI
Mutual information and conditional mutual information functions have been used to learn the structure of variants of naive Bayesian classifiers for several years. Related combinations of these functions have also been used to select relevant but not redundant feature sets. In the case of a causal Bayesian classifier, where the classification variable is not a common parent of the relevant features and where, as a consequence, hidden variables are often needed, the interaction information function can be shown to be a necessary, but not sufficient, tool for appropriately clustering variables into parent sets. Several unresolved problems in constructing a final network from the variable clusters are presented for discussion.
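For reference, interaction information for three discrete variables is commonly written as I(X;Y;Z) = I(X;Y|Z) - I(X;Y); sign conventions vary in the literature, and the sketch below simply estimates that quantity from paired samples via empirical entropies (an illustration, not the selection procedure used in the talk).

```python
import numpy as np
from collections import Counter

def entropy(samples):
    """Shannon entropy (bits) of a list of discrete values or tuples."""
    counts = np.array(list(Counter(samples).values()), dtype=float)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def interaction_information(x, y, z):
    """Estimate I(X;Y;Z) = I(X;Y|Z) - I(X;Y) from three aligned sample sequences."""
    i_xy = entropy(list(x)) + entropy(list(y)) - entropy(list(zip(x, y)))
    i_xy_given_z = (entropy(list(zip(x, z))) + entropy(list(zip(y, z)))
                    - entropy(list(z)) - entropy(list(zip(x, y, z))))
    return i_xy_given_z - i_xy
```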

 

Integrating Genetic Algorithms (GA) with a water system simulator for the multiobjective optimisation of reservoir operation
Lydia Vamvakeridou-Lyroudia
9 Dec 2008Harrison 170 Tuesday 2pmInformatics RI
AQUATOR® is a commercial software package for developing and running simulation models of natural rivers, water resources and water supply systems, using different operational rules, constraints and priorities. Developed by Oxford Scientific Software, it is being used by several water companies in the UK. The Centre for Water Systems has undertaken the task of linking AQUATOR to a GA optimisation module. Initially GANetXL, an add-in for Microsoft Excel® developed by the Centre for Water Systems, was linked to AQUATOR for the multiobjective optimisation of reservoir operation. However, due to the excessive computational time required, a new GA application, AQUATOR-GA, was developed using distributed computing and integrated within the AQUATOR environment. This seminar presents a comprehensive overview of the project, comprising the mathematical approach and structure of the multiobjective optimisation problem, the development of a prototype as an Excel add-in, and the distributed AQUATOR-GA module that followed.

 

Multiobjective Neural Network Ensembles based on Regularized Negative Correlation Learning
Huanhuan Chen (University of Birmingham)
2 Dec 2008Harrison 170 Tuesday 2pmInformatics RI
Negative Correlation Learning (NCL) is a neural network ensemble learning algorithm which introduces a correlation penalty term into the cost function of each individual network, so that each neural network minimizes its mean-square error (MSE) together with the correlation. This paper analyzes NCL in depth and observes that NCL corresponds to training the entire ensemble as a single learning machine that only minimizes the MSE without regularization. This insight explains why NCL is prone to overfitting the noise in the training set. The paper analyzes this problem and proposes the multiobjective regularized negative correlation learning (MRNCL) algorithm, which incorporates an additional regularization term for the ensemble and uses an evolutionary multiobjective algorithm to design ensembles. In MRNCL, we define the crossover and mutation operators and adopt a nondominated sorting algorithm with fitness sharing and linear rank-based fitness assignment. The experiments on synthetic data as well as real-world data sets demonstrate that MRNCL achieves better performance than NCL, especially when the noise level in the data set is non-trivial. In the experimental discussion, we give three reasons why our algorithm outperforms the others.
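The standard NCL penalty the abstract describes is easy to write down: each member's loss is its MSE plus a lambda-weighted negative-correlation term against the ensemble mean. The numpy sketch below shows only that penalty; the MRNCL regularisation term and the evolutionary machinery of the talk are not reproduced here.

```python
import numpy as np

def ncl_losses(preds, y, lam=0.5):
    """Per-network NCL loss for an ensemble.

    preds: (M, N) array of M ensemble members' predictions on N points.
    y: (N,) array of targets.
    """
    f_bar = preds.mean(axis=0)                      # ensemble mean prediction
    mse = 0.5 * ((preds - y) ** 2).mean(axis=1)     # individual MSE terms
    # Standard NCL penalty p_i = (f_i - f_bar) * sum_{j != i}(f_j - f_bar) = -(f_i - f_bar)^2,
    # since the deviations from the ensemble mean sum to zero.
    penalty = -((preds - f_bar) ** 2).mean(axis=1)
    return mse + lam * penalty
```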

 

Tearfund’s Impact on MDG 7
Frank Greaves (Programme Development Adviser for Water & Sanitation at Tearfund)
25 Nov 2008Harrison 209 Tuesday 2pmInformatics RI
Tearfund is a Christian relief and development organisation, working in 40 countries worldwide through a network of some 350 partners, and through its own operational arm (the Disaster Management Team). Tearfund’s vision is to empower the local church to bring about transformational change in poor communities. It views the need for water & sanitation as a fundamental cornerstone of development, and acknowledges the effect that access to safe and sufficient water and sanitation has on livelihoods, economic well-being, health, and education. In terms of Watsan, Tearfund aspires to work with the church in making the greatest impact possible on achieving Millennium Development Goal 7, Target 10 (“By 2015, to reduce by half the proportion of people without sustainable access to safe drinking water and sanitation”). Our campaign for next year (2009) is that 6 million people will benefit from hygiene education through Tearfund programmes, and of these, by 2015, 3 million people will have improved access to safe water and sanitation facilities (ideally within 500 metres of a water source and 50m of sanitation). We also aspire to see a Global Action Plan for water and sanitation, agreed by world leaders in 2010, and fully implemented globally. This presentation will explore the distinctive approach Tearfund brings to relief & development in water & sanitation, and will consider:
- Applying core development values of sustainability, replicability, empowerment, community participation, and holistic (whole-person) transformation.
- Case studies of Watsan programmes which both exemplify and struggle with realising these values: What typifies our successes? What typifies our struggles?
- Key approaches and concerns: sanitation, especially “demand-led” as opposed to “supply-driven” approaches; environmental sustainability; building partner capacity; mainstreaming gender; and engaging in effective advocacy in Watsan.

 

Multi-agent modelling and applications to robotics and cognition
Dr Aladdin Aayesh
19 Nov 2008Harrison 203 Wednesday 2pmInformatics RI
The link between emotion, cognition and behavior is well investigated in psychological and social studies. In recent years, system developers have started to explore, mostly as part of agent technology and the concept of agency, cognition and potential models of it that are suitable for algorithmic representation, i.e. translation into programs that can work interactively in a distributed environment. Emotion modeling has evolved as part of cognitive systems and has been recognized as an important aspect of these systems, one which greatly affects reasoning on both of its fronts: representation and inference. This interest in emotions and cognitive agents is helped by advances in games, simulations, and architectures for domestic devices (e.g. robots). Whilst cognitive science is applied to developing cognitive agents, cognitive agents are often used to simulate and test cognitive models and their impact on human perception and responses. Social simulation in particular (e.g. crowd simulation in games) depends on agent technology that utilizes some aspects of cognition and behaviorism. Some new studies have started to incorporate aspects of emotion to influence agent behavior in a group evolving into an artificial society with emergent social and collective properties, e.g. panic.

 

The Anglian Water Alliance – Modelling Experiences from AMP4 and future challenges for AMP5
James Lau (Black & Veatch)
18 Nov 2008Harrison 170 Tuesday 2pmInformatics RI
The Anglian Water Alliance is a virtual company set up to deliver the AMP4 Capital Delivery program for Anglian Water Plc. The partnership of consultants, contractors and client has formed an effective delivery vehicle focussed on delivering value and innovation. Computational modelling has played a key role in delivering appropriate and cost effective solutions for schemes with varying complexities and problems. This presentation focuses on a number of schemes where novel and innovative approaches have been used. Challenges for AMP5 will also be discussed.

 

Linking AHP and Social Choice Methods in Group Decision-Making
Bojan Srdjevic (University of Novi Sad, Serbia)
11 Nov 2008Harrison 170 Tuesday 2pmInformatics RI
Social choice (SC) theory is closely related to multicriteria decision-making (MCDM), especially in group decision contexts. SC theory includes various voting systems, while MCDM is represented by utility and outranking methods; among utility models, the analytic hierarchy process (AHP) is probably the most popular in group decision support. Two possible contexts for modelling decentralized decision problems in water management will be investigated. The first is based on AHP only and two group aggregation techniques. The second assumes the application of AHP in subgroups, while at group level aggregation is performed by SC voting procedures. Comparative analyses show good agreement of the results when the two methodologies are applied as decision support to the water committee of a given river basin. The second methodology is called AHP+SC and is considered the more promising for implementation in real decision situations in water management.

 

Out of sight - out of mind - travels of a water engineer
Jo Parker
10 Nov 2008Harrison 102 Monday 1pmInformatics RI
Presenter's experiences in Afghanistan, Bosnia, Jamaica, Philippines, Indonesia and Iran focussing on buried assets, leakage detection and water quality/hygiene.

 

Evidential Reasoning in Water Distribution System Operations
Josef Bicik (University of Exeter)
4 Nov 2008Harrison 170 Tuesday 2pmInformatics RI
Due to aging infrastructure, incidents (e.g. pipe bursts) in water distribution systems (WDS) occur on a daily basis and can lead to significant losses of potable water and interruption of service to customers. A method to support decision making of WDS operators in failure situations, based on information fusion using Dempster-Shafer theory of evidence, will be presented. Information coming from several sources (e.g. telemetry systems, historical records and customer contacts) is combined together to provide the operator with a better insight into the most likely location of a failure in a WDS. The method is placed within the context of a decision support system for near real-time WDS operation and its requirements and potential further developments are identified and discussed.
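Since the talk rests on Dempster-Shafer evidence combination, a small sketch of Dempster's rule over a toy frame of candidate burst locations may help fix ideas; the sources and mass values below are entirely hypothetical and do not come from the operational system described.

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule of combination for two mass functions keyed by frozensets of hypotheses."""
    combined, conflict = {}, 0.0
    for (a, pa), (b, pb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + pa * pb
        else:
            conflict += pa * pb                       # mass assigned to contradictory evidence
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Hypothetical evidence about a burst location from two sources.
pipes = frozenset({"P1", "P2", "P3"})
telemetry = {frozenset({"P1", "P2"}): 0.7, pipes: 0.3}
contacts = {frozenset({"P2"}): 0.6, pipes: 0.4}
print(combine(telemetry, contacts))
```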

 

Multi-objective league tables
Richard Everson (University of Exeter)
28 Oct 2008Harrison 170 Tuesday 2pmInformatics RI
As multi-objective optimisation methods are extended to many-objective problems, there is a need to visualise the mutually non-dominating sets produced. We present a new way of using barycentric coordinates to map a non-dominating set onto the plane for visualisation. The fidelity of the mapping is optimised by minimising the graph Laplacian of the nearest-neighbour distances graph. League tables ranking, for example, universities are commonly produced by combining several measures into a score via a weighted sum. Recognising this as a multi-objective problem, we show how to divide the universities into a series of mutually non-dominating sets -- Pareto shells -- and thus provide an unambiguous division into tiers. There are 5 shells for the scores that yield the THES 2008 university league table. We visualise the shells and examine various ways of ranking within a Pareto shell. The effect on the ranking of measurement noise in the scores is evaluated.
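The Pareto-shell construction itself is straightforward to sketch: repeatedly peel off the non-dominated set of the remaining rows. The snippet below assumes higher scores are better on every measure and uses random data; the barycentric visualisation and Laplacian optimisation of the talk are not attempted here.

```python
import numpy as np

def dominates(a, b):
    """a dominates b if it is at least as good on every measure and better on at least one."""
    return np.all(a >= b) and np.any(a > b)

def pareto_shells(scores):
    """Peel rows of `scores` (institutions x measures) into successive non-dominated shells."""
    remaining = list(range(len(scores)))
    shells = []
    while remaining:
        shell = [i for i in remaining
                 if not any(dominates(scores[j], scores[i]) for j in remaining if j != i)]
        shells.append(shell)
        remaining = [i for i in remaining if i not in shell]
    return shells

scores = np.random.rand(20, 4)   # hypothetical league-table measures
print(pareto_shells(scores))
```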

 

Pareto-Optimality of Cognitively Preferred Polygonal Hulls for Dot Patterns
Antony Galton (University of Exeter)
21 Oct 2008Harrison 170 Tuesday 2pmInformatics RI
In several areas of research one encounters the problem of generating an outline that is in some way representative of the spatial distribution of a pattern of dots. Several different algorithms have been published which can generate such outlines, but the detailed evaluation of such algorithms has mostly concentrated on their computational and mathematical properties, while the adequacy of the resulting outlines themselves has been left as a matter of informal human judgment. In this paper it is proposed to investigate the perceptual acceptability of outlines independently of any particular algorithm for generating them, in order to determine objective criteria for evaluating outlines from the full range of possibilities in a way that is conformable to human intuitive assessments. For the sake of definiteness it is assumed that the outline to be produced is a simple closed polygon whose vertices are elements of the given dot pattern, all remaining elements of the dot pattern being in the interior of the polygon. It is hypothesised that to produce a cognitively acceptable outline one should seek simultaneously to minimise both the area and perimeter of the polygon, and that therefore the points in area-perimeter space corresponding to cognitively optimal outlines will lie on or close to the Pareto front. A small pilot study was conducted, the results of which lend strong support to the hypothesis. The paper concludes with some suggestions for further more detailed investigations.
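The two objectives in the hypothesis are simple to evaluate for any candidate hull; as a small aside, the sketch below computes polygon area via the shoelace formula and perimeter as the summed edge lengths, assuming the vertices are given in order around the boundary (an illustration, not part of the pilot study itself).

```python
import numpy as np

def area_perimeter(vertices):
    """Area (shoelace formula) and perimeter of a simple closed polygon.

    vertices: (n, 2) array of hull vertices in order around the boundary.
    """
    x, y = vertices[:, 0], vertices[:, 1]
    x_next, y_next = np.roll(x, -1), np.roll(y, -1)
    area = 0.5 * abs(np.sum(x * y_next - x_next * y))
    perimeter = np.sum(np.hypot(x_next - x, y_next - y))
    return area, perimeter
```

Candidate outlines can then be compared in area-perimeter space, with the cognitively preferred ones expected to sit on or near the Pareto front of simultaneous minimisation.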

 

Urban Water Optioneering Tool
Evangelos Rozos (University of Exeter)
14 Oct 2008Harrison 209 Tuesday 2pmInformatics RI
UWOT is a decision support tool that simulates the urban water cycle by modelling individual units (e.g. toilets, washing machines, treatment units, tanks, reservoirs) and assessing their combined effects at the development scale. UWOT simulates not only the standard urban water flows (potable water, wastewater and runoff) but also the water flows (greywater, treated greywater or greenwater – including harvested rainwater) introduced by advanced water saving and recycling schemes. A hierarchical structure is adopted for the representation of the development's water features: the local appliances are at the bottom level of the hierarchy, the household type is at a higher level, and the development is at the top level. UWOT is linked to a database that contains information about the major characteristics of each technological category. The selection of technologies for a given urban water cycle configuration may be done manually by the user or automatically by an optimisation algorithm.

 

The Severn Barrage and Other Options - the Environmental Impact
Prof Roger Falconer (Cardiff University)
3 Jul 2008Harrison 171 Thursday 4pmInformatics RI
In recent years there has been growing international public concern about climate change, global warming, reducing the carbon footprint, increasing oil prices and the rapid depletion of oil reserves. These issues, as well as others, have led to renewed enthusiasm to consider proceeding with the Severn Barrage, either in the original location proposed by the Severn Tidal Power Group, or at a smaller scale such as the Shoots Barrage. In particular, the Sustainable Development Commission concluded in its 2007 report on ‘Turning the Tide: Tidal Power in the U.K.’ that ‘Electricity from a barrage would displace output from fossil-fuelled power stations, making a significant contribution to the UK’s renewable energy targets’. As well as the Severn Barrage there is considerable scope for various forms of tidal renewable energy along the Severn Estuary, as well as the North Wales coast and much of Scotland. The presentation will review the current main Severn Barrage proposals, as originally promoted by the Severn Tidal Power Group, together with giving a brief overview of alternative options such as the Shoots Barrage and Offshore Tidal Impoundments. In particular, emphasis will focus on assessing the potential hydro-environmental impact of a barrage, including the implications for geomorphological and flood risk changes. Finally, an outline will be given of recent research undertaken by the Hydro-environmental Research Centre, at Cardiff University, on bacterial-sediment interactions and the application of computational hydro-environmental models to predict the impacts of a Severn Barrage on the tidal currents, sediment transport concentrations and the background bacterial and water quality levels. The presentation will show that the Barrage has the potential to reduce the tidal currents in a highly dynamic estuary. This will lead to reduced suspended sediment loads (particularly upstream of the barrage), increased light penetration within the water column and, potentially, an increase in the benthic bio-diversity and the level of aquatic life in the estuary.

 

Singapore Membrane Technology Centre, NTU UNESCO Centre for Membrane Science & Technology, UNSW
Tony Fane (University of Oxford)
12 Jun 2008Harrison 171 Thursday 4pmInformatics RI
Membrane technologies are now key processes in the water industry. This presentation briefly introduces the various membrane technologies and their historical development as well as trends in costs and energy usage. The potential benefits of membranes are discussed and their roles in water supply, desalination, water reclamation and wastewater treatment are described in terms of state-of-the-art, and emerging novel concepts. Singapore provides an interesting case study of how membrane technology can be used in the water cycle. The application of membranes to decentralization is also attractive. Finally the challenges facing membrane technology, including product water quality issues, energy usage and GHG emissions, are discussed.

 

Low Impact Urban Design and Development
Jeremy Gabe (Landcare Research)
10 Jun 2008Harrison 107 Tuesday 12pmInformatics RI
The talk is about the results of current research in New Zealand, mostly on issues related to sustainable urban development (low impact urban design and development).

 

Cancelled
Professor Miranda (Plymouth University)
15 May 2008Harrison 170 Cancelled 2pmInformatics RI
Cancelled

 

Treating Uncertainties in the Real-world Scheduling
Prof. Sanja Petrovic (University of Nottingham)
8 May 2008Harrison 171 Thursday 4pmInformatics RI
Scheduling represents a major issue in modern production planning and control. Ever since the first results of deterministic scheduling theory appeared around 50 years ago, scheduling research has attracted a lot of attention from both academia and industry. Consequently, the scheduling literature is very rich. However, the majority of scheduling models and algorithms assume that all parameters are well known and precisely defined, which is rarely the case in real-world problems, where activities are often fraught with uncertainties. This often prevents the results of deterministic scheduling theory from being applied in practice. This talk will introduce a number of real-world scheduling problems and the uncertainties that they face. Three ways of using fuzzy sets in scheduling problems will be discussed: (a) in modelling uncertain scheduling parameters and constraints, (b) in multicriteria decision making, in which satisfaction grades are introduced for each of the criteria defined to evaluate the quality of the schedules, and (c) in the rules that schedulers use to draw conclusions based on imprecise premises. Some new directions in raising the level of generality of scheduling algorithms, so that they work well over a range of scheduling problems, will be discussed.

 

An Approach to Machine Musicianship
Marcelo Gimenes (Plymouth University)
24 Apr 2008Harrison 171 Thursday 4pmInformatics RI
Marcelo Gimenes investigates the genesis and development of musical styles in artificial worlds. He developed a real-time computer musical system, iMe (Interactive Musical Environments), with which it is possible to observe a number of processes regarding musical perception and cognition, and to evaluate how musical influence can lead to particular musical world views. The system has successfully been used in a real performance of a two-piano improvisation piece, one of which was played by an artificial agent. This experiment allowed the observation of a unique musical experience, connecting the human's previous musical background to the complexities of an enhanced human/machine dialogue.

 

Urban Drainage Modelling in Thames Water
Andrew Hagger (Thames Water)
17 Apr 2008Harrison 171 Thursday 4pmInformatics RI
TBA

 

KALYPSO – An Open Source Software Tool for Flood Studies in Rivers
Professor Erik Pasche (Hamburg University of Technology)
13 Mar 2008Harrison 171 Thursday 4pmInformatics RI
As a result of climate change, urbanization and land-use changes, floods are becoming more frequent and causing increased loss and damage to property and life. EU water policy has reacted with a paradigm change, from blocking off flood-prone areas (with dikes and walls) to giving water more space and living with the flood. The consequence is the need to expand the relationship between the city, space and water and to improve stakeholders' capacity to adapt to flood risk. These new guidelines for flood management require a good understanding of the impact of river morphology and anthropogenic changes on the flow regime in rivers and on the probability of inundation. Research activities in the last decades have considerably improved this understanding, leading to sophisticated mathematical fluvial flow models. They range from 1-dimensional unsteady flow models to 2- and 3-dimensional hydrodynamic models, which make use of refined roughness concepts and turbulence approaches and accomplish a high resolution of the topography. The engineering world should have access to these instruments in a most flexible and efficient way, being able to combine 1- and 2-dimensional models (hybrid modelling), to select refined roughness concepts for vegetated flood plains (physically based roughness modelling) and to compare different turbulence approaches. Also, the enormous amount of geographical data calls for an efficient and versatile data management and visualization tool. Thus present research concentrates more on model integration and data mining than on model generation. For the past two years a team of researchers at the Institute of River & Coastal Engineering at TUHH/Hamburg, together with engineers from Björnsen Consulting Engineers Koblenz, have developed such an integration shell for fluvial flow modelling called KALYPSO (www.kalypso.wb.tu-harburg.de). This Open Source software system for flood risk modelling is based on OGC standards (www.opengeospatial.org) and provides an Open GIS user interface for map-based data access and input. A Web Map Service (WMS) based on an implementation of the Open Source software deegree (www.deegree.org) provides access to GIS data via the Internet. With a work flow browser the user is guided through the tasks of flood modelling in a logical order, including various tools for grid generation and boundary data management (pre-processing), for defining and starting the simulation cases (processing) and for analysing and visualizing the simulation results in inundation maps, flood damage maps and flood risk maps (post-processing). For the modelling of fluvial flow a hybrid modelling technique is available, integrating 1- and 2-dimensional discretization elements and the St. Venant and Shallow Water Equations. This new modelling shell for fluvial floods will be presented and its theoretical concept illustrated. The hybrid modelling technology, the data mining functionality and the open GIS-based GUI will be explained and their application demonstrated on real cases of fluvial floods.

 

Water Distribution System Modelling and Calibration
Daniel Kozelj (University of Ljubljana)
28 Feb 2008Harrison 171 Thursday 4pmInformatics RI
The seminar will present the Slovenian experience of water distribution system modelling, together with the fields of calibration, sampling design and optimisation. An overview will be given of the aforementioned issues and project results will be presented. The projects presented will include: (1) Calibration of a WDS model by Genetic Algorithms (part of the Ljubljana WDS – 35,000 inhabitants), (2) Sampling Design for Calibration of WDS by GA (Sezana – 11,000 inhabitants), (3) Rehabilitation of a WDS by GA (Logatec – 11,000 inhabitants) and (4) Experiences of modelling WDS and macrocalibration (Skofja Loka – 22,000 inhabitants and Celje – 65,000 inhabitants). Finally, the role of the tools in facilitating these processes will be discussed based on the findings of the real life applications.

 

WDS Pressure Management for Water Loss Reduction: State of the Art
Julian Thornton (Thornton International Ltd.)
19 Feb 2008Harrison 215 Tuesday 4pmInformatics RI
Water distribution system pressure management encompasses several approaches and has a number of important benefits; it has been referred to as “the preventative method par excellence” of water loss management. Changes in leak flow rates and some components of consumption are now reasonably predictable; however, there has been little published data on how improved management of excess pressures and surges can influence the new-break frequency of mains and services. This discussion will cover the types of pressure management utilized in the industry today, how to assess potential volume benefits from pressure management, and the challenge which lies ahead in improving the industry's understanding of how to assess pressure-break frequency relationships. Several case studies will be cited during the discussion.
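The predictability of leak flow rates mentioned above is usually expressed, in the water-loss literature, through a power-law ("N1") relationship between leak flow and average zone pressure; the tiny sketch below illustrates that relationship only, with a purely indicative exponent rather than any figure from the talk.

```python
def leakage_after_pressure_change(l0, p0, p1, n1=1.0):
    """Predicted leak flow after a pressure change, using L1 = L0 * (P1 / P0) ** N1."""
    return l0 * (p1 / p0) ** n1

# Indicative example: dropping average zone pressure from 60 m to 45 m of head.
print(leakage_after_pressure_change(l0=100.0, p0=60.0, p1=45.0, n1=1.1))
```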

 

Water & Sewerage Capital Maintenance Risk Modelling
Nathan Muggeridge, and Mark Randall-Smith (Mouchel Parkman)
14 Feb 2008Harrison 170 Tuesday 4pmInformatics RI
The Water Companies are required to prepare Asset Management Plans (AMPs) every 5 years, and these determine the level of charges for water and sewerage services. The Asset Management Plan for the period between 2010 and 2015 is currently being prepared and OFWAT, the water industry Regulator, requires the capital maintenance elements of the plans to adopt a risk-based approach. This presentation provides an insight into the preparation of an Asset Management Plan for water and sewerage infrastructure, and demonstrates the role of research in the development of the plans. It will also highlight some of the current problems with implementing a risk-based approach to capital maintenance.

 

Low Energy and Sustainable Cooling of Underground Railway Systems
Jolyon Thompson (Research Engineer - Cooling the Tube Programme)
7 Feb 2008Harrison 170 Tuesday 4pmInformatics RI
Underground railway system usage is growing throughout the developing world and in many cities the underground railway is the most commonly used form of public transport. The intense service provided on these systems generates substantial quantities of rejected heat energy. This energy can significantly increase air temperatures within the trains and tunnels. When coupled with high ambient temperatures this can lead to passenger discomfort and health issues. Conventional air conditioning systems have been used in some modern underground railway installations, but their operation has had limitations and leads to highly energy intensive solutions. Conventional air conditioning often cannot be included in older systems because of heat rejection and spatial problems. Sustainable cooling systems could reduce the overall system energy usage and provide an acceptable environment for passengers. These could include energy management methods such as reduced train velocity and low-weight carriages, as well as sustainable cooling technologies introduced from modern building services engineering, such as groundwater and geothermal cooling. However, due to the increased complexity of the air flow paths, the increased heat density and the transient nature of the situation, the problem provides many unique and difficult engineering challenges. Most low energy and sustainable systems that could be used on underground railways have been developed and used in more conventional building services applications. Ventilating an underground railway environment, in which there is to be heavy traffic of electrically-propelled rapid-transit trains, differs from the situations normally encountered in conventional building-services applications. The heat generated by the train motors and electric lighting, together with body heat from passengers, is so great that excessive temperatures would prevail in summer when limited cooling is available. During this seminar the drivers behind low energy and sustainable cooling will be briefly examined and some possible methods will be briefly discussed. The seminar will then provide detailed discussion of groundwater cooling, evaporative cooling and geothermal cooling. The seminar will finally conclude with some general remarks about the objectives and strategies of introducing sustainable and low energy cooling to an operating underground railway.

 

Evolving the Water Industry to the Next Generation (With a Statistical Bent) - Lessons We Can Learn from Other Industries
Tim Watson (MWH UK Ltd)
31 Jan 2008Harrison 170 Tuesday 4pmInformatics RI
The recent drive within the water industry has been towards ‘whole of life’ costing, necessitating robust and quantifiable forecasting of asset performance. However, it is generally accepted that data is sparse, unreliable, and of poor quality; constructing robust statistical forecasting models therefore remains a challenge. In this talk, we will highlight the above problem with two examples. Firstly, we believe that the current OFWAT directive of ‘business as usual’ – embedding a decision support system within the assets themselves – is an excellent first step towards populating the necessary data sets. Secondly, as a comparison to the water industry, we will provide examples from other industries where such a ‘business as usual’ undertaking has been achieved over the past 10 years, resulting in an environment that is data rich and therefore able to produce robust forecasts of future investment.

 

On the efficient use of uncertainty when performing expensive ROC optimisation
Dr Jonathan Fieldsend (UoE)
24 Jan 2008Harrison 171 Thursday 3pmInformatics RI
When optimising receiver operating characteristic (ROC) curves there is an inherent degree of uncertainty associated with the operating point evaluation of a model parameterisation x. This is due to the finite amount of training data used to evaluate the true and false positive rates of x. The uncertainty associated with any particular x can be reduced, but only at the computational cost of evaluating more data. Here we explicitly represent this uncertainty through the use of probabilistically non-dominated archives, and show how expensive ROC optimisation problems may be tackled by only evaluating a small subset of the available data at each generation of an optimisation algorithm. Illustrative results are given on data sets from the well known UCI machine learning repository.
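The underlying point, that true and false positive rates estimated from finite data carry binomial uncertainty, can be made concrete with a simple Beta posterior on each rate; this is only an illustration of where the uncertainty comes from, not the probabilistic non-dominance machinery used in the talk (counts below are hypothetical).

```python
from scipy import stats

def rate_interval(successes, trials, level=0.95):
    """Posterior mean and credible interval for a rate under a Beta(1, 1) prior."""
    post = stats.beta(successes + 1, trials - successes + 1)
    lo, hi = post.interval(level)
    return post.mean(), (lo, hi)

# Hypothetical operating point evaluated on 200 positives and 500 negatives.
tpr_mean, tpr_ci = rate_interval(successes=150, trials=200)   # true positive rate
fpr_mean, fpr_ci = rate_interval(successes=40, trials=500)    # false positive rate
print(tpr_mean, tpr_ci, fpr_mean, fpr_ci)
```

Evaluating more data tightens these intervals, which is exactly the trade-off between evaluation cost and uncertainty that the talk exploits.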

 

Rectilinearity Measure for Detecting Buildings on Satellite Images
Dr Jovisa Zunic (UoE)
17 Jan 2008Harrison 171 Thursday 3pmInformatics RI
Rectilinear structures often correspond to human-made structures, and are therefore justified as attentional cues for further processing. For instance, in aerial image processing and reconstruction, where building footprints are often rectilinear on the local ground plane, building structures, once recognized as rectilinear, can be matched to corresponding shapes in other views for stereo reconstruction. Perceptual grouping algorithms may seek to complete shapes based on the assumption that the object in question is rectilinear. Using the proposed measure, such systems can verify this assumption.

 

TBA
Dr Richard Everson (UoE)
10 Jan 2008Harrison 170 Tuesday 2pmInformatics RI
TBA

 

Sewer Infrastructure Capital Maintenance - Theory and Practice
Alec Erskine (MWH UK Ltd)
6 Dec 2007Harrison 170 Tuesday 2pmInformatics RI
A discussion of the impact of the Common Framework on asset management in the water industry particularly the sewerage infrastructure. Attempts to apply risk theory and cost benefit analysis to the CCTV inspection of sewers prior to intervention. What actually happens in practice at selected water companies.

 

Real Life MCDM Applications in Water Resources Management: Case Studies – Serbia and Brazil
Prof. Bojan Srdjević (University of Novi Sad)
4 Dec 2007Harrison 209 Tuesday 4pmInformatics RI
An overview of two real life applications will be presented: (1) Serbian Case Study - Participative decision-making at regional hydro system level based on AHP methodology (Nadela system in Vojvodina Province); and (2) Brazilian Case Study - Assessment of water allocation scenarios in a reservoir system by combined use of river basin simulation models and MCDM methods (Tandem reservoirs França and São Jose de Jacuípe in Paraguacu river basin, Bahia). The AHP multicriteria method will be presented in brief as well.

 

System Dynamics Modelling (SDM) for the Simulation of Complex Water Systems
Lydia S. Vamvakeridou-Lyroudia (University of Exeter)
29 Nov 2007Harrison 170 Tuesday 2pmInformatics RI
System Dynamics Modelling (SDM) is a methodology for studying and managing complex feedback systems, typically used when formal analytical models do not exist but a system simulation can be developed by linking a number of feedback mechanisms. Such models are particularly useful for building, understanding and presenting models to non-engineers. Moreover, they are useful for developing models in a participatory, interdisciplinary context, as is the case for most European Commission (EC) projects. This seminar presents the procedure for developing SDM for complex water systems within the EC FP6 project AQUASTRESS, followed by two specific application case studies.

 

Physical Modelling of Venford Dam Spillway
Professor Godfrey Walters (University of Exeter)
22 Nov 2007Harrison 170 Tuesday 2pmInformatics RI
Following a request from South West Water, a physical model of the spillway structure and toe works of Venford Dam on Dartmoor was built at a 1:20 scale in the Fluids Laboratory. This talk outlines the background to the study, the theoretical and practical details of constructing the model, and the outcomes of a series of tests on the existing structure and proposed amendments.

 

Lessons from BedZED
Chris Shirley Smith (Director of Water Works UK Ltd)
15 Nov 2007Harrison 170 Tuesday 2pmInformatics RI
The lecture will cover the development of BedZED as planned, the construction period and the water management plan. Things did not go entirely as planned and the lessons which can be drawn from the project are opened for viewing and subsequent discussion.

 

Recent Advances in Data Assimilation in Large Scale Hydrodynamic and Hydrological Forecasting Models
Henrik Madsen (DHI (Danish Hydraulic Institute))
8 Nov 2007Harrison 170 Tuesday 2pmInformatics RI
The use of data assimilation in hydrodynamic and hydrological forecasting systems has advanced considerably in recent years. This talk gives a review of the developments, considering the data assimilation problem within a general filtering framework. This framework incorporates updating of different modelling components in the forecast system, including model forcing, model state, and model parameters. It includes as a special case the classical Kalman filter. Various extensions of the filter especially tailored towards operational applications are reviewed. These include (i) approximate Kalman filter schemes that utilize cost-effective approximations of the error modelling, (ii) combination of filtering and forecasting of model prediction errors, (iii) filtering with coloured or biased model errors, and (iv) application of regularization techniques.
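The classical Kalman filter mentioned as the special case of this framework has a compact measurement-update step, sketched below for the linear-Gaussian setting with numpy; the operational extensions (i)-(iv) discussed in the talk are not reproduced here.

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """Kalman measurement update.

    x: prior state mean, P: prior state covariance,
    z: observation, H: observation matrix (z = H x + noise), R: observation noise covariance.
    """
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x_new = x + K @ (z - H @ x)              # updated state estimate
    P_new = (np.eye(len(x)) - K @ H) @ P     # updated state covariance
    return x_new, P_new
```

In a forecasting system, this update would be interleaved with the model's own forecast (time-update) step, which is where forcing, state and parameter updating enter the general framework described above.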

 

Water Distribution Network System Modelling in China: Opportunities and Challenges
Professor Hongbin Zhao, Director, Water and Wastewater Systems Research Center (Harbin Institute of Technology, China)
1 Nov 2007Harrison 170 Tuesday 2pmInformatics RI
There is a large demand for modelling water distribution networks and systems in China, and with that, a huge potential market for modellers and model developers. However, the modelling community in China is still relatively small in size compared to Western countries, and research and development is still relatively slow. This talk will attempt to: (i) discuss and summarise various hurdles currently faced by modellers and model developers in China; (ii) discuss some of the problems in using hydraulic models to simulate water distribution networks; and (iii) propose some new directions in water distribution research.

 

New Approaches to Adaptive Water Management Under Uncertainty (NeWater)
Raziyeh Farmani (University of Exeter)
25 Oct 2007Harrison 170 Tuesday 2pmInformatics RI
In this talk an introduction will be given to the NeWater project, an Integrated Project (IP) funded by the European Commission (EC) under the Sixth Framework Programme (FP6). This will be followed by an introduction to adaptive water resources management. The need for a transition from prevailing water management regimes towards adaptive regimes, in the face of changing climatic, global and socio-economic boundary conditions, will be highlighted. Finally, the role of the tools in facilitating these processes will be discussed and some real applications will be presented.

 

Water Resources Modelling Tools - The ODYSSEUS Project
Evangelos Rozos (University of Exeter)
18 Oct 2007Harrison 170 Tuesday 2pmInformatics RI
The subject of the “ODYSSEUS” project was the development of an integrated system of computing tools which, in combination with a parallel framework of methodologies and technical specifications, provided an infrastructure suitable for the rational and sustainable management of water resource systems at a variety of scales. The research teams that participated in the project implementation came from academia, the private sector and local authorities. The project was funded by the European Development Fund (EDF), the European Social Fund (ESF), the Greek State and Greek private organisations, in the framework of the Operational Project “Competitiveness” of the Third Community Cohesion Fund. The project duration was three years, beginning in July 2003.

 

Asset Deterioration, Multi-Utility Data and Multi-Objective Data Mining
Dragan Savic (University of Exeter)
11 Oct 2007Harrison 170 Tuesday 2pmInformatics RI
Physically-based models derive from first principles (e.g. physical laws) and rely on known variables and parameters. Because these have physical meaning, they also explain the underlying relationships of the system and are thus usually transportable from one system to another as a structural entity, while only the model parameters have to be updated. Data-driven or regressive techniques involve data mining for modelling, and one of their major drawbacks is that the functional form describing relationships between variables and the numerical parameters is not transportable to other physical systems, as is the case with their classical physically-based counterparts. Aimed at striking a balance, Evolutionary Polynomial Regression (EPR) offers a way to model multi-utility data of asset deterioration in order to render model structures transportable across physical systems. EPR is a recently developed hybrid regression method providing symbolic expressions for models; it works with formulae based on true or pseudo-polynomial expressions, usually in a multi-objective scenario where the best Pareto-optimal models (parsimony vs. accuracy) are selected from data in a single case study. This article discusses how improving EPR to deal with multi-utility data (multiple case studies) advances data-driven modelling while achieving a general model structure for asset deterioration prediction across different water and wastewater systems.

 

Two Stochastic Approaches for Benchmarking Urban Water Systems
Michael Möderl (University of Innsbruck)
4 Oct 2007Harrison 170 Tuesday 2pmInformatics RI
A traditional procedure for the performance evaluation of systems is to test approaches and methodologies on one or more case studies. However, it is well known that the investigation of real case studies is a tedious task. Moreover, due to the limited number of case studies available, it is not certain that all aspects of a problem can be covered by such a procedure. With increasing computer power an alternative methodology has emerged: the investigation of a multitude of virtual case studies by means of a stochastic consideration of the overall performance. Within the framework of this approach we develop here a Modular Design System for water distribution systems, and we will develop a Case Study Generator for urban drainage systems. With the algorithmic application of such tools it is possible to create a variety of different virtual case studies. Additionally, the benchmarking of virtual case studies is shown through an application example, in which 2,280 different water distribution systems are evaluated.

 

Engineering Civilisation from the Shadows
ICE - Institute of Civil Engineers - South West (6th Brunel International Lecture)
11 Jul 2007Royal Clarence Hotel - Wednesday 6pmInformatics RI

This lecture will examine world poverty and climate change in the 21st century, focusing on the role of engineering in addressing these challenges in relation to the Millennium Development Goals.

The lecture is free to attend but it is necessary to register attendance.

** Places are limited, so register early to avoid disappointment **

Contact Barbara Davey, Regional Administrator:
t +44 (0)1626 879 836
e barbara.davey@ice.org.uk
w ice-southwest.org.uk

ICE South West
10 Newton Road, Bishopsteignton
Teignmouth, Devon TQ14 9PN

 

The Design and Rehabilitation of Water Distribution Systems Using the Hierarchical Bayesian Optimisation Algorithm
Ralph Olsson (UoE)
25 May 2007Harrison 209 Friday 12noonInformatics RI
The design and rehabilitation of water distribution systems is considered here as a constrained least cost optimisation problem. In the design and rehabilitation of water distribution systems the solution space tends to be vast. This paper uses the hierarchical Bayesian Optimisation Algorithm (hBOA) (Pelikan and Goldberg, 2000) to search the solution space with the aim of decreasing the number of fitness evaluations required for convergence. hBOA is a probabilistic model building genetic algorithm which uses a Bayesian network to model the set of promising solutions in each generation. This network is in turn sampled to generate the offspring to be incorporated into the next generation. hBOA has been shown to solve a number of challenging problems in sub-quadratic time, that is the number of fitness evaluations required for convergence is of order less than the square of the number of design variables. Since a fitness evaluation requires the solution of a set of hydraulic equations there are significant time benefits to be had by reducing the number of evaluations. hBOA is applied to two sample networks as deterministic, single objective problems; the New York Tunnels problem and the 'Anytown' network. Initial results show hBOA to compare favourably with other evolutionary computation techniques in the literature, both in terms of the proportion of runs in which the best known solution is found and in terms of the number of fitness evaluations required.

 

Imprecise probabilities of climate change: aggregation of fuzzy scenarios and model uncertainties
Dr. Guangtao Fu (UoE)
18 May 2007Harrison 209 Fri 12noonInformatics RI
Whilst the majority of the climate research community is now set upon the objective of generating probabilistic predictions of climate change, disconcerting reservations persist. Attempts to construct probability distributions over socio-economic scenarios are doggedly resisted. Variation between published probability distributions of climate sensitivity attests to incomplete knowledge of the prior distributions of critical parameters in climate models. This presentation addresses these concerns by adopting an imprecise probability approach. We think of socio-economic scenarios as fuzzy linguistic constructs. Any precise emissions trajectory can be thought of as having a degree of membership in a fuzzy scenario. Rather than attempting to distribute an additive probability measure across scenarios, a weaker assumption is adopted in the form of monotonic (but non-additive) measures. It is demonstrated how fuzzy scenarios can be propagated through a low-dimensional climate model, MAGICC. Fuzzy scenario uncertainties and an imprecise probabilistic representation of climate model uncertainties are combined using random set theory to generate lower and upper cumulative probability distributions for Global Mean Temperature. Further, the application of this method in a decision-making context is demonstrated through a flood risk analysis of the Thames defence system.

 

Bayesian modelling of time aggregated water pipe bursts with a zero-inflated, non-homogeneous Poisson process.
Theodoros Economou (UoE)
11 May 2007Harrison 209 Fri 12noonInformatics RI
A commonly used approach to modelling recurrent failures is based on a non-homogeneous Poisson process (NHPP) and requires data on actual failure times. Modelling and predicting bursts in underground water pipes is vital to water companies from both an economic and a conservation perspective, but the available data often do not allow use of a conventional NHPP for two reasons. Firstly, data is commonly only recorded as numbers of failures over a (relatively long) time period and not as exact failure times. Secondly, failures are usually only observed in a very small proportion of pipes in the network. This paper proposes a model derived from the conventional NHPP which only makes use of the number of failures in an observed time period and the age of each pipe at the end of this period, but is still able to capture the age deterioration phase of the reliability curve. The model is then further extended to account for censoring and truncation in the data as well as an excess of zeros. Application of this `aggregated' model and its zero-inflated extension are illustrated on a data set involving a network of 532 cement water pipes in Manukau City, Auckland, New Zealand.
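One way to write down the kind of likelihood the abstract describes: under an NHPP the number of failures in an observation window is Poisson with mean equal to the integrated intensity, and zero-inflation adds a point mass at zero. The Weibull-type intensity and parameter names below are illustrative only and are not claimed to match the paper's exact formulation.

```python
import numpy as np
from scipy.stats import poisson

def zip_nhpp_loglik(n_bursts, age0, age1, alpha, beta, pi):
    """Log-likelihood of burst counts observed over (age0, age1] for each pipe.

    Integrated power-law intensity: Lambda = alpha * (age1**beta - age0**beta).
    pi is the zero-inflation probability (a pipe that is effectively burst-free).
    """
    lam = alpha * (age1 ** beta - age0 ** beta)
    p_nonzero = (1 - pi) * poisson.pmf(n_bursts, lam)
    p_zero = pi + (1 - pi) * np.exp(-lam)
    return np.sum(np.log(np.where(n_bursts == 0, p_zero, p_nonzero)))
```

In a Bayesian treatment such as the one described, this likelihood would be combined with priors on alpha, beta and pi and explored by MCMC.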

 

Evolutionary computation-based meta-modelling - a way forward to run thousands of large simulation?
Dr Soon-Thiam Khu (UoE)
4 May 2007Harrison 209 Fri 1pmInformatics RI
Evolutionary algorithms such as genetic algorithms (and other population-based search algorithms) are increasingly used to solve many engineering optimisation problems. Although these algorithms have the distinct advantage of being able to find near-global solutions relatively fast, the bottleneck in using evolutionary algorithms for real-world optimisation is the computationally intensive running of the engineering simulation model. The talk will attempt to examine various ways of resolving this problem, while proposing a relatively new method known as evolutionary computation-based meta-modelling. This talk should be of interest to all modellers (like myself) frustrated with long simulation times when using GA optimisation. It will also be a great opportunity for computer scientists to share their experience in resolving such problems.
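The basic idea behind meta-modelling is to fit a cheap surrogate to a sample of expensive simulator runs and let the evolutionary algorithm query the surrogate for most of its fitness evaluations. The toy scikit-learn sketch below illustrates that idea only; the stand-in simulator, sample sizes and surrogate choice are assumptions, not the specific method proposed in the talk.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def expensive_simulation(x):
    """Stand-in for a long-running engineering model (hypothetical)."""
    return np.sum(x ** 2)

# Fit a surrogate to a small design of experiments run through the real simulator...
rng = np.random.default_rng(0)
X_train = rng.uniform(-5, 5, size=(30, 3))
y_train = np.array([expensive_simulation(x) for x in X_train])
surrogate = GaussianProcessRegressor().fit(X_train, y_train)

# ...then let the GA evaluate the bulk of its candidate solutions on the surrogate,
# reserving true simulator runs for occasional verification or retraining.
candidates = rng.uniform(-5, 5, size=(1000, 3))
approx_fitness = surrogate.predict(candidates)
```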

 

Challenges in the development of "non-powered, no moving part" technologies for urban water management
Dr Mike Faram (Hydro International plc)
5 Apr 2007Harrison 107 Thursday 2pmInformatics RI
Founded in 1980, Hydro International has developed a reputation as being an innovative and forward thinking party within the water management "equipment supply" sector. The company ranks among the top 800 UK R&D spenders overall and among the top 50 "industrial engineering" spenders. The presentation, by the group's Technical Manager, will provide an introduction to the company, focusing in particular on its technologies, many of which make clever use of fluid-dynamic or hydraulic principles in their operation. The challenges faced in developing such "simple in form" yet "complex in operation" systems will be discussed. This will consider the use of experimental methods and numerical techniques (including Computational Fluid Dynamics (CFD)) in product development, and the challenges faced in converting the outputs into practical performance prediction and design selection models.

 

Integrated wastewater asset management for small catchments
Guo, Yufeng (South West Water)
30 Mar 2007Harrison 102 Friday 1pmInformatics RI
Advanced asset management of wastewater systems has recently gained increasing support within industry and research. This project presents a decision support tool (DST) to address the integrated management of wastewater assets in South West Water, including wastewater treatment works, sewers and wastewater pumping stations. In the DST, a catchment ranking system based on key performance indicators is set up to identify critical catchments. Proactive interventions are suggested, with their potential impacts on cost, performance and failure trends. Finally, cost-effective maintenance strategies are accommodated, which can be employed in the industry capital maintenance plan and utilised as part of "business as usual" operational management.

 

Flood Risk Management
Dr Slobodan Djordjevic (University of Exeter)
16 Mar 2007Harrison 102 Friday 1pmInformatics RI
The Flood Risk Management Research Consortium (FRMRC) is an ongoing research project involving about thirty UK institutions, with a total budget in excess of £6m. Following an overview of the Consortium and the background of the CWS involvement, outputs to date from Work Package 6.1, Urban Flooding, will be presented. Various problems in the development of modelling approaches will be discussed and examples from case studies will be shown. The following topics will be covered: GIS-based tools to support integrated modelling of urban flooding; interactions between below-ground and above-ground systems; coupled 1D/1D and 1D/2D simulation of urban flooding; and sensitivity-based flood risk attribution. Research themes envisaged for the second phase of the Consortium (FRMRC2, which is currently being negotiated with EPSRC and other funders) will be addressed, as well as some other projects that will result from this research in the near future.

 

Spatial survival modelling and the 2001 UK foot-and-mouth disease epidemic
Professor Trevor Bailey (UoE)
9 Mar 2007Harrison 102 Fri 1pmInformatics RI
The potentially large economic impacts of animal disease epidemics have been highlighted in recent years through outbreaks such as foot-and-mouth disease (FMD) in the UK during 2001. This talk reports work from a project with the Veterinary Laboratories Agency (VLA), Weybridge, UK, which is concerned with the use of survival modelling to develop dynamic space-time predictions of survivor and hazard functions for individual farm premises as an animal disease epidemic progresses. Such survival analyses could provide powerful insights into the patterns of infection, and assist in optimising various aspects of the operational response activities, such as targeting of `at-risk' premises. The talk will discuss and compare various possible Bayesian model formulations using both real and simulated epidemics and then go on to illustrate how model predictions may be used to refine epidemic control policies.

 

Managing Distribution Retention Time to Improve Water Quality
Malcolm Brandt (Black and Veatch Ltd)
1 Mar 2007Harrison 107 Thursday 2pmInformatics RI
When water leaves a treatment works and travels through a distribution system, its quality, with respect to many chemical and biological parameters, will degrade. The quality of the delivered water will be largely influenced by: the quality of treated water supplied into the network; the condition of distribution assets within the network; and the retention time within the network. The water industry has focused predominantly on the quality of treated water and the physical condition of distribution assets when improving the quality of water at the customer's tap. However, the quality of the water delivered is also affected by the time the water is retained in the different elements of the distribution network. Retention time is controlled both by the physical characteristics of the system and by the operational regime. Physical characteristics such as pipe roughness may change throughout the life of the asset or be modified by rehabilitation. The aim of this research is to demonstrate that water quality within distribution networks can be managed effectively by controlling retention time, and to develop practical and pragmatic methodologies for doing so.

 

Evolutionary Optimisation for Microsoft Excel
Josef Bicik (UoE)
23 Feb 2007Harrison 102 Fri 1pmInformatics RI
The aim of this seminar is to introduce a new tool allowing the use of single- and multiple-objective optimisation using genetic algorithms from within Microsoft Excel. The presentation will show the features of the tool and demonstrate them on several optimisation problems. Special attention will be paid to the use of the tool in the optimisation of water systems; however, it will be shown that it can also be integrated with various other simulation packages.

 

An Operational Framework for Trading Water Abstraction Permits in England and Wales
Ana Maria Millan (University of Exeter)
9 Feb 2007Harrison 102 Fri 1pmInformatics RI
This study aims to show how the use of market forces can improve the allocation of water resources in England and Wales. At present, there are growing concerns about the availability of water resources in England and Wales to satisfy increasing demands. These derive mainly from traditional consumers, from the requirements of the Water Framework Directive to protect the water-dependent environment, and from the effects of climate change on flow variability and drought frequency. These conditions show the vulnerabilities of the existing allocation system, which, in turn, creates an opportunity to propose changes.

 

Modelling Pollutant Adsorption: The atomistic view
Arnaud Marmier (UoE)
2 Feb 2007Harrison 102 Fri 1pmInformatics RI
Halogenated chemicals have been used as pesticides and generated as by-products of industrial, commercial or social activities. Crucial information needed in order to assist stakeholders in developing remediation strategies for land contaminated by these pollutants includes data on how the pollutants move through a contaminated environment and how the pollutants change and degrade with time. Experiments to answer these questions are difficult to perform and require a significant investment of time. Consequently, such experiments are generally limited to studies of a few specific molecules interacting with a small number of soil types. Atomic-scale computational models provide a method complementary to experiment. They can produce additional information that aids our understanding of the fundamental processes that lead to pollutant mobility. In this communication, I focus on the adsorption of polychloro-dibenzo-p-dioxins (PCDDs) and polychloro-biphenyls (PCBs). In order to better understand how these pollutants move through soils I performed calculations at the atomic scale to resolve how they bind to the components of typical soils. The surface chosen is the (001) face of the di-octahedral 2:1 sheet silicate pyrophyllite, Al4Si8O20(OH)4. This is a particularly simple example of the family of clay minerals that form a significant fraction of many soil and rock types. I will also discuss the possibilities offered by some eScience tools for high-throughput modelling of this sort.

 

HANDLING UNCERTAINTY IN REAL-TIME FLOOD FORECASTING SYSTEMS
Professor Ian Cluckie (University of Bristol)
1 Feb 2007Harrison 107 Thursday 2pmInformatics RI
The seminar will focus on the problems of handling uncertainty in complex real-time flood forecasting systems. This will inevitably introduce the use of closely coupled high-resolution mesoscale atmospheric models at one end of the scale and quantitative weather radar at the other. Some of the features of coupled models and systems will be introduced in relation to both urban and rural flood forecasting. The seminar will seek to provide an overall briefing on the area with an introduction to some of the exciting developments which are currently underway. This is particularly so in terms of the current efforts to utilise dual-polarisation weather radar technology for operational purposes in the United Kingdom.

 

Modelling of contaminant transport in soils
Mohammed Al-Najjar (University of Exeter)
15 Dec 2006Harrison 102 Friday 1pmInformatics RI
The movement of contaminants through soils to the groundwater is a major cause of degradation of water resources. In many cases, serious human and stock health implications are associated with this form of pollution. In this presentation, the development and validation of a numerical model for simulation of the flow of water and air, heat transfer and contaminant transport through unsaturated soils will be presented. All the major mechanisms and phenomena controlling transport of contaminants in soils (including advection, dispersion, diffusion, adsorption (under equilibrium and non-equilibrium conditions), chemical reactions and the effect of mobile and immobile domains on transport of contaminants in the soil) have been considered in the model. The mathematical framework and the numerical implementation of the model will be presented and described. The validation of the model will be presented by application to several experiments on contaminant transport in soils from the literature. The application of the model to some case studies will then be presented and discussed.

 

Simulated annealing and greedy search for multi-objective optimisation
Dr Richard Everson (University of Exeter)
8 Dec 2006Harrison 209 Friday 1pmInformatics RI
Evolutionary optimisation algorithms can be broadly categorised by the search techniques used. One such categorisation can be made upon the acceptance criteria for solutions within the algorithm; two popular criteria are to allow only greedy searching (to discard solutions which are not an improvement over previous solutions) and to retain new solutions by Metropolis-Hastings sampling (as used in simulated annealing). Another categorisation can be made comparing algorithms which sequentially optimise a single solution and those which consider a set of solutions for optimisation. In this seminar we introduce a new multi-objective simulated annealing technique and illustrate the effect upon convergence of combinations of greedy searching, searching with an annealing schedule, sequential optimisation of a single solution and optimisation of a set of solutions. Apparently greedy strategies are found to be remarkably effective for many problems and we discuss the types of multi-objective problem that will prevent convergence of greedy optimisers.
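
As a rough illustration of the two acceptance criteria contrasted above, the sketch below shows a greedy rule alongside a Metropolis-style rule for a minimisation problem. The dominance test is standard, but the "energy" driving the annealing probability (the number of objectives that worsen) is purely illustrative and is not the energy measure proposed in the talk.

```python
import math
import random

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimisation)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def accept(current_f, candidate_f, temperature, greedy=False):
    """Decide whether to move to the candidate solution.

    Greedy search only accepts candidates that are not dominated by the
    current solution; the annealing variant also accepts dominated
    candidates with a Metropolis-style probability that decays as the
    temperature falls.  The 'energy difference' used here (the number of
    objectives that worsen) is an illustrative stand-in.
    """
    if not dominates(current_f, candidate_f):
        return True                      # candidate is no worse: always accept
    if greedy:
        return False                     # greedy: reject dominated candidates
    worse = sum(c > f for c, f in zip(candidate_f, current_f))
    return random.random() < math.exp(-worse / max(temperature, 1e-12))
```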

 

Fuzzy multiobjective optimisation for WDS
Dr Lydia Vamvakeridou-Lyroudia (University of Exeter)
1 Dec 2006Harrison 102 Fri 1pmInformatics RI
This seminar presents the application of fuzzy reasoning, combined with hierarchical multiobjective optimisation for the design of water distribution networks, using genetic algorithms. The problem has been formally structured as a Multilevel Multicriteria Decision Making (MCDM) process with two objectives: cost minimisation and benefits maximisation, resulting in a Pareto trade-off curve of non-dominated solutions. A number of criteria are introduced, individually assessed by fuzzy reasoning. The overall benefits function is a combination (aggregation) of criteria, according to their relative importance and their individual fuzzy assessment. It is obtained through an analytic hierarchy process (AHP), applied directly within the genetic algorithm, using an original mathematical approach. The decision maker enters data and preferences using exclusively linguistic “engineer friendly” definitions and parameters. In this way, the whole design algorithm moves away from strict numerical functions, and acts as a Decision Support System (DSS) for water network design optimisation. The model has been applied to “Anytown”, a well-known benchmark network, for which results are presented and discussed.

 

Multi-objective Optimization on Operation of Water Distribution Systems in China
Professor Liu (Tongji University, China)
30 Nov 2006Harrison 107 Thursday 2pmInformatics RI
Traditional research and application of optimal operation of water distribution systems has mostly been based on least-cost functions for construction and energy consumption. With rapid urban development and ageing pipe networks worldwide, new targets and tasks have arisen for water supply companies to deal with, e.g. water pressure satisfaction, water quality uncertainty, water leakage and bursts, environmental requirements, etc. The presentation introduces extended-period simulation models and computer software for the optimal operation of water distribution systems, used for assessing and forecasting hydraulic behaviour and water quality changes, and working towards real-time optimal operation and control based on multiple objectives. The software has been applied in several cities in China, with valuable results for improving planning and operation techniques in water distribution systems. In addition, definitions of optimal diameters and economic leakage in the pipe network will be discussed.

 

Discolouration of water distribution networks
Dr Jan Vreeburg (KIWA (Netherlands))
24 Nov 2006Harrison 102 Fri 1pmInformatics RI
Discolouration is the single most important reason for customers to complain to their water company about water quality. Traditionally, discoloured water has been assumed to be the result of corrosion of unlined cast iron pipes. In the Netherlands (and also elsewhere), however, the problem also exists in completely plastic or cementitious pipes, so corrosion cannot be the single cause. In the last decade a great deal of research has been done into the nature and cause of discoloured water, which has resulted in new knowledge and new approaches. Managing discoloured water in a network demands a multi-angle approach. In the Netherlands a three-stage approach has been developed, based on an understanding of the theory of particles in the network. Remedial actions are aimed at hydraulically controlling the accumulation of particles. The results of this approach are comprehensive guidelines for cleaning networks and for designing self-cleansing networks. Continuing research in the Netherlands is now concentrating on the challenge of improving water quality at the treatment works to prevent loading of the network, and in that way to obtain a proactive solution instead of reactive measures such as cleaning the network. The aim of the lecture is to present this new knowledge and these analysis techniques and to translate them into practical guidelines and new research challenges for the typical UK situation.

 

Translating conjunctive water management from concept to practice in mature irrigation systems
Imogen Fullagar (CSIRO, Australia)
17 Nov 2006Harrison 102 Fri 1pmInformatics RI
Environmental and production demands for water are increasing interest in realising system efficiencies across groundwater and surface water. This is difficult in mature irrigation systems because management regimes tend to separate groundwater and surface water responsibilities and accounting. Ms Fullagar's research has been to design scoping processes to identify system efficiency opportunities and to deliver these through local (socio-economic and management) circumstances. The development of these processes has been based primarily on qualitative data from Coleambally, a rice-growing irrigation area in New South Wales, Australia. Feedback and comment on the processes and research method are welcome.

 

Data-driven Approach to Asset Deterioration Modelling.
Prof Dragan Savic (University of Exeter)
10 Nov 2006Harrison 102 Fri 1pmInformatics RI

 

Multi-Objective Optimisation in the Presence of Uncertainty
Jonathan Fieldsend (University of Exeter)
3 Nov 2006Harrison 102 Friday 1pmInformatics RI
There has been only limited discussion on the effect of uncertainty and noise in multi-objective optimisation problems and how to deal with it. Here this problem is addressed by assessing the probability of dominance and maintaining an archive of solutions which are, with some known probability, mutually nondominating. Methods are examined for estimating the probability of dominance. These depend crucially on estimating the effective noise variance and a novel method for learning the variance during optimisation is discussed as part of this. Probabilistic domination contours are presented as a method for conveying the confidence that may be placed in objectives that are optimised in the presence of uncertainty.
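
The probability of dominance can be made concrete under a simple noise model. The sketch below assumes each observed objective equals the true value plus independent zero-mean Gaussian noise of known standard deviation, and that objectives are independent; the variance-learning scheme and archive update rule described in the talk are not reproduced here.

```python
from math import erf, sqrt

def prob_dominates(f_a, f_b, sigma):
    """Probability that solution a dominates solution b (minimisation),
    assuming each observed objective is the true value plus independent
    zero-mean Gaussian noise with standard deviation sigma.

    For each objective the difference (a - b) is Normal(f_a - f_b, 2*sigma^2),
    so P(a_i < b_i) = Phi((f_b[i] - f_a[i]) / (sigma * sqrt(2))).  Treating
    objectives as independent, the dominance probability is the product.
    """
    p = 1.0
    for fa, fb in zip(f_a, f_b):
        z = (fb - fa) / (sigma * sqrt(2.0))
        p *= 0.5 * (1.0 + erf(z / sqrt(2.0)))   # standard normal CDF at z
    return p

# Example: two noisy bi-objective evaluations
print(prob_dominates([1.0, 2.0], [1.5, 2.5], sigma=0.3))
```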

 

TBA
Francesco Di-Pierro (University of Exeter)
27 Oct 2006Harrison 102 Friday 1pmInformatics RI
TBA

 

Discolouration in potable water distribution systems
Dr Joby Boxall (Pennine Water Group)
20 Oct 2006Harrison 102 Friday 1pmInformatics RI
Discolouration is one of the biggest causes of water quality related customer complaints received by water supply companies. Hence, water companies wish to be able to predict the occurrence of discolouration events and implement appropriate operational and maintenance strategies to reduce such complaints. However, there is a large degree of uncertainty around the processes that lead to discolouration events. This talk will present the development of a modelling approach towards predicting the occurrence of discolouration events and the results of field and laboratory studies undertaken to elucidate discolouration processes and mechanisms. For further background information, see the project web page www.PODDS.co.uk

 

Improved Methods for the Optimum Design and Operations of Water Distribution Systems
Prof. Graeme Dandy (University of Adelaide)
19 Oct 2006Harrison 170 Tuesday 2pmInformatics RI
A water distribution system (WDS) is a very complicated system of reservoirs, tanks, pipes, pumps and valves that supplies water to the population of a city. If the system is working properly, it will provide water on demand at adequate pressures and of a suitable quality. A great deal of research has been carried out into the optimum design and operations of WDS. Evolutionary techniques such as genetic algorithms (GAs) have proven to be extremely useful in solving these large non-linear optimisation problems. However, GAs have the disadvantage that they may require very long run times to identify near-optimal solutions. This presentation will provide some background on the development and application of genetic algorithms for optimising the planning, design and operations of WDS. The emphasis will be on the practical applications of this set of techniques to real water supply systems. There will be a discussion of recent developments aimed at producing significant speed-up of GAs using meta-modelling. Meta-modelling works by combining the GA with a simplified (statistical or neural network) model of the system. Excellent results have been achieved at run times that are 100 to 1000 times faster than the GA alone. There will also be a discussion of recent research into the optimisation of water quality in WDS and the consideration of sustainability objectives in WDS.
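
A minimal sketch of the meta-modelling idea is given below: a cheap surrogate is fitted to previously evaluated solutions and used to screen candidates, so that only the most promising ones are passed to the expensive simulation. The GA operators, the surrogate type and the thresholds are all illustrative placeholders rather than the specific choices used in the work described.

```python
import random

def surrogate_assisted_search(expensive_eval, train_surrogate, population,
                              generations=50, retrain_every=5):
    """Illustrative surrogate-assisted evolutionary loop (minimisation).

    'expensive_eval' stands for the full simulation model; 'train_surrogate'
    fits a cheap approximation (e.g. a regression or neural network) to the
    (solution, fitness) pairs seen so far and returns a predictor.  Most
    candidates are ranked with the surrogate and only a few are sent to the
    expensive model.
    """
    archive = [(x, expensive_eval(x)) for x in population]   # seed with true evaluations
    surrogate = train_surrogate(archive)
    for gen in range(generations):
        # generate offspring by simple Gaussian mutation (placeholder for GA operators)
        offspring = [[g + random.gauss(0, 0.1) for g in x] for x, _ in archive]
        offspring.sort(key=surrogate)                        # cheap screening
        for x in offspring[:3]:                              # expensive evaluation of the best few
            archive.append((x, expensive_eval(x)))
        archive.sort(key=lambda xf: xf[1])                   # keep the best solutions found so far
        archive = archive[:len(population)]
        if gen % retrain_every == 0:
            surrogate = train_surrogate(archive)             # refresh the meta-model
    return archive[0]
```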

 

Multiobjective optimization of water reuse systems
Darko Joksimovic (University of Exeter)
13 Oct 2006Harrison 102 Friday 1pmInformatics RI
This presentation will deal with the recently completed research project AQUAREC, in which the role of the CWS was to derive design principles for integrated water reuse systems. This was achieved by first developing a simulation and optimisation methodology, incorporating it in a DSS named WTRNet, and applying the developed software to several case studies. Following a brief introduction to the concept of integrated water reuse planning, the DSS methodology will be presented with particular focus on its optimisation component. The presentation will conclude with a discussion of WTRNet application results and possible future work.

 

The Battle of the Water Sensor Networks: Multiobjective Cross-Entropy Approach
Gianluca Dorini and Philip Jonkerguow (University of Exeter)
6 Oct 2006Harrison 102 Friday 1pmInformatics RI

 

What is the Region Occupied by a Set of Points?
Dr Anthony Galton (University of Exeter)
5 Oct 2006Harrison 170 Tuesday 2pmInformatics RI
There are many situations in GIScience where it would be useful to be able to assign a region to characterize the space occupied by a set of points. Such a region should represent the location or configuration of the points as an aggregate, abstracting away from the individual points themselves. In this paper, such a region will be called a 'footprint' for the points. We investigate and compare a number of methods for producing such footprints, with respect to nine general criteria. The discussion identifies a number of potential choices and avenues for further research. Finally, we contrast the related research already conducted in this area, highlighting differences between these existing constructs and our 'footprints'.

 

An overview of the research activities at IIT Kanpur: Hydrological Modelling
Dr. Ashu Jain (IIT Kanpur, India)
1 Jun 2006Harrison 170 Tuesday 2pmInformatics RI
"In this talk I plan to first give a very brief background of the Indian Institutes of Technology (IITs), IIT Kanpur, and the department of civil engineering at IIT Kanpur. The research activities of the Hydraulics and Water Resources Engineering (HWRE) group at IIT Kanpur will be presented. The focus will be on hydrological modelling. Some case studies on water demand modelling and hydrological modelling will be presented. The methods employed include data-driven techniques of Artificial Neural Networks (ANNs) and Genetic Algorithms (GAs). Special issues such as difficulties in ANN training, integration of conceptual and data-driven approaches, etc. will be discussed and the results presented. If the time permits, a case study on the exploration of physical significance in trained ANN hydrologic models will also be presented."

 

Research in the Computer Graphics and Interactive Systems Laboratory at the Technical University of Cluj-Napoca in Romania
Professor Dorian Gorgan (Technical University of Cluj-Napoca)
23 May 2006Harrison 170 Tuesday 2pmInformatics RI
Satellite image processing on Grid structures (the MEDIOGRID project); location-based services; graphical modelling and simulation of 3D textile surfaces; pen-based annotation in 2D and 3D graphical applications.

 

Multilevel Classifier Systems - Issues, Motivations and Challenges
Dr Bogdan Gabrys (Bournemouth University)
6 Apr 2006Harrison 170 Tuesday 2pmInformatics RI
The talk will be concerned with pattern classification problems and in particular the motivations and challenges behind using and designing multiple classifier systems. Starting with an attempt to answer the question of why would one like to combine classifiers we will move to an overview of how to combine them. This will be followed by discussing the issues of majority voting limits and an illustration of the potential enhancement for theoretical error limits when using hierarchical multistage organisations for majority vote based combination systems. The talk will also cover the issues of classifier diversity, classifier selection and will finish with discussions of various practical evolutionary algorithm based implementations of multistage, multilevel and multidimensional selection-fusion models.

 

TBA
Professor Mario Alexandre Teles de Figueiredo
2 Jun 2005Harrison 254 Tuesday 3pmComputer Science

 

THIS SEMINAR HAS BEEN CANCELLED 'Automated feature detection and classification in Solar Feature Catalogues'.
Dr Valentina Zharkova
26 May 2005Harrison 254 Tuesday 3pmComputer Science
The searchable Solar Feature Catalogues (SFC), developed using automated pattern recognition techniques from digitized solar images, are presented. The techniques were applied for the detection of sunspots, active regions, filaments and line-of-sight magnetic neutral lines in automatically standardized full-disk solar images in Ca II K1, Ca II K3 and Hα taken at the Meudon Observatory, and in white-light images and magnetograms from SOHO/MDI. The results of automated recognition were verified against manual synoptic maps and available statistical data, which revealed good detection accuracy. Based on the recognized parameters, a structured database of the Solar Feature Catalogues was built on a MySQL server for every feature and published with various pre-designed search pages on the Bradford University web site http://www.cyber.brad.ac.uk/egso/. The SFCs, with a coverage of 10 years (1996-2005), are to be used for solar feature classification and activity forecasting; the first classification attempts will be discussed.

 

Computation without representation, and other mysteries
Derek Partridge (Department of Computer Science)
13 May 2005Harrison 171 Friday 4pmComputer Science (Internal)
The talk will cover the necessity for software that is not, and cannot be, faultless, but is, nevertheless, optimal. In particular, the need for inductive software technologies and why it may be impossible to track down known bugs. Then we move on to computations that know when they're wrong (to cope with the inevitable erroneous outputs), and a final generalization into the philosophical notion of accurately approximate computation as an alternative to precisely correct/incorrect computation.

 

THIS SEMINAR HAS BEEN CANCELLED
Using perceptual models to improve fidelity and provide invariance to valumetric scaling for quantization index modulation watermarking
Professor Ingemar Cox
5 May 2005Harrison 254 Tuesday 3pmComputer Science
Quantization index modulation (QIM) is a computationally efficient method of watermarking with side information. This paper proposes two improvements to the original algorithm.
First, the fixed quantization step size is replaced with an adaptive step size that is determined using Watson's perceptual model. Experimental results on a database of 1000 images illustrate significant improvements in both fidelity and robustness to additive white Gaussian noise.
Second, modifying the Watson model such that it scales linearly with valumetric (amplitude) scaling results in a QIM algorithm that is invariant to valumetric scaling. Experimental results compare this algorithm with both the original QIM and an adaptive QIM and demonstrate superior performance.
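
For readers unfamiliar with QIM, the sketch below shows the basic fixed-step embed/detect idea on a single coefficient: one bit selects one of two interleaved quantisation lattices, and detection finds the nearer lattice. Watson's perceptual model, the adaptive step size and the valumetric-scaling-invariant variant described in the abstract are not reproduced here.

```python
def qim_embed(value, bit, step):
    """Embed one bit into a scalar coefficient by quantising it onto one of
    two interleaved lattices, offset from each other by half a step."""
    offset = 0.0 if bit == 0 else step / 2.0
    return round((value - offset) / step) * step + offset

def qim_detect(value, step):
    """Recover the embedded bit by finding which lattice the received
    coefficient lies closest to."""
    d0 = abs(value - qim_embed(value, 0, step))
    d1 = abs(value - qim_embed(value, 1, step))
    return 0 if d0 <= d1 else 1

# Example: embed bit 1 into a coefficient and read it back after small noise
w = qim_embed(12.7, 1, step=2.0)
print(w, qim_detect(w + 0.3, step=2.0))
```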

 

The Soft Machines: Computing with the Code of Life
Martyn Amos (Department of Computer Science)
18 Mar 2005Harrison 171 Friday 4pmComputer Science (Internal)
Cellular computing is a new, rapidly expanding field of research at the intersection of biology and computer science. It is becoming clear that, in the 21st century, biology will be characterized more often as an information science. The flood of data generated first by the various genome projects, and now by high-throughput gene expression, has led to increasingly fruitful collaborations between biologists, mathematicians and computer scientists. However, until recently, these collaborations have been largely one-way, with biologists taking mathematical expertise and applying it in their own domain. With the onset of molecular and cellular computing, though, the flow of information has been growing in the reverse direction. Computer scientists are now taking biological systems and modifying them to obtain computing devices. Cells are being re-engineered to act as microbial processors, smart sensors, drug delivery systems and many other classes of device. This talk traces the brief history of cellular computing and suggests where its future may lie.

 

RSS Local Group Meeting
Bayesian spatial partition modelling in epidemiological case-control studies
Carmen Fernández (Lancaster University)
10 Mar 2005Exeter University Laver, 321 Thurs 2pmStatistics & Operational Research
The talk considers developments in the modelling of case-control data where each sample individual is associated with a geographical location. A logistic regression framework is used and residual spatial variation is flexibly accommodated via the use of Voronoi tessellations with unknown numbers and locations of tiles. Modifications for matched case-control studies are also discussed.

 

How computer science can reveal problems with the standard model of cancer and can identify new models of cancer
Ajit Narayanan (Department of Computer Science)
4 Mar 2005Harrison 171 Friday 4pmComputer Science (Internal)
The dominant paradigm of cancer development is that mutations to a small number of genes transform a healthy cell into a cancerous cell by blocking normal pathways or making other pathways hyperactive. Specific molecular pathways ('subway lines') are claimed to be responsible for programming these behaviours. My work on cancer gene expression data does not support this view.

While my own research aim is also to come up with networks and maps of cancer, what distinguishes my approach from the dominant paradigm and its alternatives is that, in my approach, a specific pathway may be normal or cancerous depending on the expression values of genes making up the pathway. We do not need to assume that a cancer pathway is like a tube-train taking a different and unscheduled route from normal. Instead we can assume that the same route may be used in both normal and cancer cells. What differs is the number of trains passing through each station. It may not even matter that some of these trains are not fully formed trains (i.e. are mutated). What matters is the volume of traffic along the route: a route that is normal one day can become cancerous another day if the volume of traffic along the route changes significantly. If this is right, this will lead to a new view of cancer development and progression that has immediate and very different implications for possible therapeutic intervention and prevention.

No knowledge of biology is assumed. The talk will introduce the dominant paradigm of cancer and present some of our work on the alternative, leaving room for discussion and speculation.

 

Towards an Evolutionary Computation Approach to the Origins of Music
Dr Eduardo Reck Miranda
3 Mar 2005Harrison 254 Tuesday 3pmComputer Science
Evolutionary Computation (EC) may have varied applications in Music. This paper introduces three approaches to using EC in Music (namely, engineering, creative and musicological approaches) and discusses examples of representative systems that have been developed within the last decade, with emphasis on more recent and innovative works. We begin by reviewing engineering applications of EC in Music Technology, such as Genetic Algorithms and Cellular Automata sound synthesis, followed by an introduction to applications where EC has been used to generate musical compositions. Next, we introduce ongoing research into EC models to study the origins and evolution of music and detail our own research work on modelling the evolution of musical systems in virtual worlds.

 

Bayesian Averaging over Decision Trees
Vitaly Schetinin (Department of Computer Science)
25 Feb 2005Harrison 171 Friday 4pmComputer Science (Internal)
Bayesian averaging (BA) over classification models allows analysts to estimate the uncertainty of classification outcomes within prior knowledge. By averaging over an ensemble of classification models, the class posterior distribution, e.g. its shape and parameters, can be estimated and used by analysts to judge the confidence intervals. However, the standard Bayesian methodology has to average over all possible classification models, which makes this methodology computationally infeasible for real-world applications.

A feasible way of implementing BA is to use the Markov Chain Monte Carlo (MCMC) technique of random sampling from the posterior distribution. Within the MCMC technique the parameters of the classification model are drawn from the given priors. The proposed models are accepted or rejected according to a Bayes rule. When the class posterior distribution becomes stable, the classification models are collected and their classification outcomes are averaged.

Regarding Decision Tree (DT) classification models, which provide a trade-off between classification accuracy and interpretability, there are three questions which still remain open. The first is the condition under which the Markov Chain can make reversible moves and guarantee that the MCMC can explore DTs with different parameters within the given priors. The second question is how to avoid local minima during the MCMC search. The third and final question is how to select a single DT from the DT ensemble which could be used for interpretation. All three problems will be discussed and the experimental results obtained on some real-world data (e.g. the UCI Machine Learning Repository, StatLog, the Trauma data, etc.) will be presented.
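
Once a set of DTs has been collected from the chain, the averaging step itself is straightforward. The sketch below assumes, purely for illustration, that each sampled tree exposes a predict_proba method returning class probabilities; the reversible-jump moves over tree structure discussed in the talk are not shown.

```python
import numpy as np

def bayesian_average_predict(sampled_trees, x):
    """Average the class posterior over an ensemble of decision trees
    collected from an (assumed converged) MCMC chain.

    Each tree is assumed to expose predict_proba(x) returning a vector of
    class probabilities; the posterior mean gives the averaged prediction
    and the spread across trees gives a rough measure of uncertainty.
    """
    probs = np.array([tree.predict_proba(x) for tree in sampled_trees])
    mean = probs.mean(axis=0)        # averaged class posterior
    spread = probs.std(axis=0)       # sampling spread across the ensemble
    return mean.argmax(), mean, spread
```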

 

Postgraduate Seminar
Some investigations in discriminant analysis with mixed variables
Nor Idayu Mahat (University of Exeter)
24 Feb 2005Exeter University Laver, 321 Thurs. 2pmStatistics & Operational Research
The location model has been developed for the treatment of mixed categorical and continuous variables in discriminant analysis. In its recent development, Asparoukhov and Krzanowski (2000) suggested estimating the parameters of the location model by using non-parametric smoothing procedures. This approach overcomes some deficiencies in Maximum Likelihood Estimation and Linear Model Estimation. However, the choice of smoothing parameters by maximising the leave-one-out pseudo-likelihood function suggested in this approach depends on distributional assumptions. This talk will describe how the smoothing parameters can instead be chosen by optimising either the error rate or the Brier score, neither of which makes distributional assumptions. Some investigations of other possible smoothing procedures will also be discussed.

 

A Dominance-Based Mapping from Multi-Objective Space for Single Objective Optimisers
Kevin Smith (Department of Computer Science)
18 Feb 2005Harrison 171 Friday 4pmComputer Science (Internal)
Traditional optimisation research has concentrated on the single-objective case, where one measure of the quality of the system is optimised exclusively. Most real-world problems, however, are constructed from multiple, often competing, objectives, which prevents the use of single-objective optimisers. Here a generic mapping to a single objective function is proposed for multi-objective problems, allowing single-objective optimisers to be used for the optimisation of multi-objective problems. A simulated annealer using this mapping is proposed and is shown to perform well on both test and commercial problems.

 

Identifying Familiarity and Dialogue Type in Transcribed Texts
Andrew Lee (Department of Computer Science)
11 Feb 2005Harrison 171 Friday 4pmComputer Science (Internal)
In a spoken dialogue, there is a lot of information that is not explicitly stated but can be identified through non-linguistic features, such as tone of voice or a change in speaker. However, this information is not always available when the conversation is transcribed into a written text.

In this talk, I'll be describing methods for measuring two aspects of dialogues that can be lost when transcribed: the familiarity between participants and the type of dialogue.

In the case of familiarity, Dialogue Moves are counted for conversational transcripts from the Map Task corpus. The differences in Dialogue Move pair distributions are compared between transcripts where participants are either familiar or unfamiliar with each other to explore whether a measure of familiarity can be based on this approach.

To identify the type of dialogue, the frequency distribution of Verbal Response Modes in a transcribed text is counted for a number of different dialogues, including interviews, presentations and speeches. Profiles generated from the frequency distributions are then used as a basis for comparison to identify the closest matching dialogue type.

 

ROC Optimisation of Safety Related Systems
Jonathan Fieldsend (Department of Computer Science)
4 Feb 2005Harrison 171 Friday 4pmComputer Science (Internal)
In this talk the tuning of critical systems is cast as a multi-objective optimisation problem. It is shown how a region of the optimal receiver operating characteristic (ROC) curve may be obtained, permitting the system operators to select the operating point. This methodology is applied to the STCA system, showing that the current hand-tuned operating point can be improved upon, as well as providing the salient ROC curve describing the true-positive versus false-positive trade-off. In addition, through bootstrapping the data we can also look at the effect of data uncertainty on individual parameterisations, and on the ROC curve as a whole.
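
The underlying trade-off is simply the true-positive and false-positive rate achieved by each candidate parameterisation. The sketch below computes a single ROC point for a plain threshold detector on labelled scores; the STCA system, its actual parameters and the multi-objective optimiser itself are left abstract, so treat the names and data here as illustrative.

```python
def roc_point(scores, labels, threshold):
    """True-positive and false-positive rates of a detector that raises an
    alert whenever its score exceeds the threshold (labels: 1 = genuine)."""
    tp = sum(1 for s, y in zip(scores, labels) if s > threshold and y == 1)
    fp = sum(1 for s, y in zip(scores, labels) if s > threshold and y == 0)
    pos = sum(labels)
    neg = len(labels) - pos
    return tp / max(pos, 1), fp / max(neg, 1)

# Sweeping the threshold over the observed scores traces one (un-optimised) ROC curve
scores = [0.1, 0.4, 0.35, 0.8, 0.65, 0.2]
labels = [0, 0, 1, 1, 1, 0]
print([roc_point(scores, labels, t) for t in sorted(set(scores))])
```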

 

Ultrasound Image Segmentation
Professor Alison Noble (University of Oxford)
3 Feb 2005Harrison 254 Tuesday 3pmComputer Science
Ultrasound image segmentation is considered a challenging area of medical image analysis as clinical images vary in quality and image formation is nonlinear. Most classical approaches to segmentation do not work well on this data.
In this talk, I will provide an overview of research we have done in this area, using what I call weak physics based approaches, specifically looking at the tasks of displacement estimation, and segmentation of tissue regions and perfusion uptake.

 

Adopting Open Source tools in a production environment: are we locked in?
Brian Lings (Department of Computer Science)
21 Jan 2005Harrison 171 Friday 4pmComputer Science (Internal)
Many companies are using model-based techniques to offer a competitive advantage in an increasingly globalised systems development industry. Central to model-based development is the concept of models as the basis from which systems are generated, tested and maintained. The availability of high-quality tools, and the ability to adopt and adapt them to the company practice, are therefore important qualities. Model interchange between tools becomes a major issue. Without it, there is significantly reduced flexibility, and a danger of tool lock-in. In this talk I report on a case study in which a systems development company, SAAB Combitech, has explored the possibility of complementing their current proprietary tools by exporting models in XMI to open source products for supporting their model-based development activities. We found that problems still exist with interchange, and that the technology needs to mature before industrial-strength model interchange becomes a reality.

 

TBA
Dr Sunil Vadera
20 Jan 2005Harrison 209 Tuesday 3pmComputer Science

Much of our daily reasoning appears to be based on stereotypes, exemplars and anecdotes. Yet basic statistics informs us that decisions based on only limited data are, at best, likely to be inaccurate, if not badly wrong. However, exemplars and stereotypes are not arbitrary data points; they are based on experience and represent prototypical situations. The ability to predict the behaviour of a consumer, observe that two people are related, diagnose an illness, and even anticipate how an MP might vote on a particular issue, all depend on a person's past experience - that is, the exemplars and stereotypes a person learns.

If this hypothesis, namely that we can form and reason well with exemplars is true, we should be able to identify exemplars from data. To achieve this, we need to answer the following questions: (a) What is an exemplar and how can it be represented? (b) How do we learn good exemplars incrementally? (c) How can exemplars be used?

This seminar presents an approach to these questions that involves the use of the notion of family resemblance to learn exemplars and Bayesian networks to represent and utilise exemplars. Empirical results of applying the model will be presented and relationships with other models of machine learning also discussed.

 

MODELLING THROUGH: The growing role of decision analytic models in Health Technology Assessment
Martin Pitt (Department of Computer Science)
14 Jan 2005Harrison 171 Friday 4pmComputer Science (Internal)
In recent years, analytic modelling has been widely used to support decision making within the National Health Service (NHS) for a range of applications and with varying levels of success.

A recent development has been the adoption of mathematical and computer-based models as a central element of the process of Health Technology Assessment (HTA). HTA is concerned with the evaluation of alternative healthcare interventions (e.g. alternative drug therapies) in order to directly inform decision making. It is now at the forefront of health research in the UK, representing the largest single strand of NHS-funded research activity. HTA outputs form a key element in the process by which the National Institute for Clinical Excellence determines general guidelines for UK prescription and clinical practice.

This presentation will take an informal look at the rapidly developing field of decision analytic modelling in HTA. It will outline the range of alternative mathematical and computer-based approaches adopted, with reference to case study examples, and illustrate how model outputs feed into the decision making process. Key challenges within this field, such as how data uncertainty is handled, will be examined in relation to current areas of active development.

 

***Jonas Gamalielsson - Developing a Method for Assessing the Interestingness of Rules Induced from Microarray Gene Expression Data ***Zelmina Lubovac - Revealing modular organization of the protein interactome by combining its topological and functional properties ***Simon Lundell - Modelling the Haematopoietic Stem Cell System.
Jonas Gamalielsson, Simon Lundell and Zelmina Lubovac
17 Dec 2004Harrison 209 Wednesday 3pmComputer Science
***Jonas Gamalielsson - Abstract: The aim and contribution of this work is to develop a method for assessing the interestingness of rules induced from microarray gene expression data using a combination of objective and domain-specific measures, which will assist biologists in finding the most interesting hypotheses to test after mining for rules in microarray gene expression data and will also generate more accurate models than if objective measures alone were used. More specifically, a method is being developed for assessing the biological plausibility of hypothetical regulatory relations generated by data mining algorithms applied to gene expression data. The idea is to use an information fusion approach in which knowledge about the Gene Ontology functional classification of gene products and the topology of known regulatory pathways is used to generate templates representing general knowledge of regulation in pathways. Templates show what kind of gene products, with respect to molecular function, have been found to participate in different types of regulatory relations in pathways. A training set of regulatory relations is used to derive the templates. A test set of hypothetical regulatory relations is used to assess how well the templates can distinguish between hypothetical relations showing a high and a low level of biological plausibility with respect to the set of training relations. ***Zelmina Lubovac - Abstract: Understanding the structure of the protein interaction network is useful as a first step towards revealing the underlying principles of the large-scale organisation of the cell. In this project, we analyse a topological characterisation of the yeast (Saccharomyces cerevisiae) protein-protein interaction network and relate it to functional annotations from the Gene Ontology (GO). We aim to develop a biologically informed measure to reveal modular formations in an interactome. A semantic similarity measure has been used to assess the role of hubs in a network in terms of functional annotation from GO. The existing graph-theoretic notion of the module has been used in previous work to perform a modular decomposition of the protein network, i.e. to break down sub-graphs into a hierarchy of nested modules or units that group proteins with common functional roles. Our attempt is to complement the existing graph-theoretic approach with semantic similarity based on proteins' ontological terms to achieve more biologically plausible descriptions of modular decomposition. ***Simon Lundell - Abstract: Throughout the life of higher animals the haematopoietic system produces blood, with a composition of various cell types, e.g. lymphocytes, erythrocytes and platelets. This system has to adapt to the animal's growth and to changes such as stress conditions, e.g. infection and blood loss. If the animal is infected then a new composition of blood cells is needed, as well as compensation for the cells lost in the battle against the intruder. The haematopoietic system is highly adaptable and has stem cells that are dormant for long periods of time. The dormant cells can, when needed, proliferate and may then give rise to millions of new cells. The regulation of the number of haematopoietic cells is crucial to animals; the haematopoietic system must avoid depletion of stem cells as well as excess production of cells, both of which are life-threatening conditions. A mouse produces 60% of its body weight in blood cells in its lifetime and a human produces 10 times its body weight.
Despite this large production of blood cells, only a few stem cells have been stated as necessary to recreate the haematopoietic system. Using an object-based model of this system we were able to reproduce experimental data and to find out which types of feedback regulation are active during stem cell transplantation. The system's intricate organisation, dynamic structure and the need to compare the simulations to a large number of experimental setups call for a new way of organising these models.

 

Wiggly Outlines: The Ins and Outs of Non-convex Hulls
Antony Galton (Department of Computer Science)
17 Dec 2004Harrison 171 Friday 4pmComputer Science (Internal)
Given a set of objects located in space, how can we represent the region occupied by those objects at different granularities? At a very fine granularity, the region simply consists of all the points occupied by the objects themselves, whereas at a very coarse granularity it might be given by the convex hull of those points. In many cases, neither of these representations is very useful: what we want is some kind of non-convex hull to convey the overall 'shape' of the region. But whereas the convex hull of a set of points is uniquely defined, there are any number of candidates for their non-convex hull. In this talk I shall introduce and explore some of the properties of a family of non-convex hulls generated by a generalisation of a simple convex-hull algorithm.
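
For reference, the sketch below gives a plain gift-wrapping (Jarvis-march) convex hull. One common way to obtain a non-convex hull from the same wrapping idea is to restrict each step's candidate set to points near the current boundary vertex so the outline can turn inwards; that restriction is only sketched in the comments and is not necessarily the family of hulls developed in the talk.

```python
def cross(o, a, b):
    """Z-component of the cross product (a - o) x (b - o)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def gift_wrap(points):
    """Jarvis-march convex hull of a set of 2-D points (counter-clockwise).

    A non-convex ('wiggly') hull can be obtained from the same wrapping idea
    by restricting, at each step, the candidate points to those within some
    distance of the current boundary vertex, so the boundary may turn
    inwards; that restriction is omitted here.
    """
    points = sorted(set(map(tuple, points)))
    if len(points) < 3:
        return points
    hull = [points[0]]                        # leftmost point is on the hull
    while True:
        candidate = points[0]
        for p in points[1:]:
            # replace the candidate whenever p lies clockwise of it
            if candidate == hull[-1] or cross(hull[-1], candidate, p) < 0:
                candidate = p
        if candidate == hull[0]:
            break                             # wrapped all the way round
        hull.append(candidate)
    return hull

print(gift_wrap([(0, 0), (2, 0), (2, 2), (0, 2), (1, 1)]))
```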

 

A personal view of computer assisted formal reasoning
Matthew Fairtlough (University of Sheffield)
10 Dec 2004Harrison 171 Friday 4pmComputer Science (Internal)
I will survey my work of the last 13 years in the field of logic and formal reasoning; in particular I will discuss the methods, logics, implementations, tools and applications involved. The most striking development has been a new modal logic called Lax Logic, so my talk will focus on this. I will give a flavour of the logic itself and of its application to formal reasoning across (multiple) abstraction boundaries, where inevitably constraints arise whose values may not be precisely known in advance. I will end by presenting some recent work using Mathematica to analyse and animate a dynamic 3-dimensional lattice.

 

Seeing where the other guy is coming from: A survival guide for young researchers working in multi-disciplinary subjects.
James Hood (Department of Computer Science)
3 Dec 2004Harrison 171 Friday 4pmComputer Science (Internal)
Part of the confidence I have gained recently in my own work is the result of trying to see where different disciplinary groups in my general field of research are coming from -- what they are trying to gain in doing what they are doing. This action, which has led to a better understanding of my own research goals, was not possible until I situated my own research against that of a greater research community. The upshot of this exercise was my very own 'research map', on which, as the product of generalisation of people and groups, I felt I had a better idea of not only where the different groups were coming from, but also where I was going to. In this talk I want to present to you my research map and my experiences of and the lessons I have learned from creating it. Bridge building in multi-disciplinary research is now so important that all of us could benefit from doing a little map making every now and then. I hope therefore that this talk will be of some interest.

 

Postgraduate Seminar
Spatial Survival Analysis and the FMD epidemic in Devon
Trevelyan McKinley (University of Exeter)
2 Dec 2004Exeter University Laver, 321 Thurs 2pmStatistics & Operational Research
The magnitude of the potential economic impacts of epidemic animal disease events has been highlighted in recent years through outbreaks such as the foot-and-mouth disease epidemic in the United Kingdom during 2001. This talk reports some initial work from a project in conjunction with the Veterinary Laboratories Agency, Weybridge, which is looking at the feasibility of using survival modelling to develop dynamic space-time predictions of survivor and hazard functions for individual farm premises as an animal disease epidemic progresses. Survival analysis used in a spatial context is a potentially useful approach to quantifying the risk of infection of susceptible premises within future time periods, given the characteristics of these premises and their geographical location relative to potential sources of infection. Results from such analyses could provide powerful insights into the patterns of infection, such as regional differences in the dynamics of the epidemic, and can assist in optimising various aspects of the operational response activities, such as the targeting of at-risk farms. We consider various possible model formulations and apply a range of these to data from Devon for the 2001 UK FMD epidemic.

 

Explanatory Shifts and structures for knowledge
Donald Bligh (Department of Computer Science)
19 Nov 2004Harrison 171 Friday 4pmComputer Science (Internal)
Explanations involve fitting what is to be explained into a context that has already formed in the minds or brains of the one who explains and the one who tries to understand. But how did they come to understand that preformed context? By fitting it into an earlier preformed context, and so on ad infinitum?

Well no, not quite. Eventually you reach fundamental elements of preformed contexts, of which there are at least ten: sameness, difference, change, direction, force, awareness, like/dislike, obligation and intention. Just as items of knowledge are built upon previous knowledge, so their preformed contexts build upon each other. The elements build upon each other in ever more complex ways, a bit like complex molecular structures built of simple elements.

When asked to justify an explanation the reverse process occurs – analysis or deconstruction of the explanation. If so, the justification is in a different context from the explanation being justified. That is what I call an explanatory shift.

 

RSS Local Group Meeting:
Competing Risks: a brief introduction
Martin Crowder (Imperial College)
18 Nov 2004Exeter University Laver, 321 Thurs 2pmStatistics & Operational Research
The origins of Competing Risks date back to Bernoulli's attempt in 1760 to disentangle the risk of dying from smallpox from other risks. Much subsequent work has been demographic and actuarial in nature and although obviously of potential relevance in Reliability and Survival Analysis, applications to those fields are quite recent. The talk will cover some of the basic ideas and application of the subject.

 

Theory of Molecular Computing. Splicing and Membrane systems
Pier Frisco (Department of Computer Science)
12 Nov 2004Harrison 171 Friday 4pmComputer Science (Internal)
Molecular Computing is a new and fast growing field of research at the interface of computer science and molecular biology driven by the idea that molecular processes can be used for implementing computations or can be regarded as computations. This research area has emerged in recent years not only as a novel technology for information processing, but also as a catalyst for knowledge transfer between the fields of information processing, nanotechnology and biology. Molecular Computing (together with research areas such as Quantum Computing, Evolutionary Algorithms and Neural Networks) belongs to Natural Computing which is concerned with computing taking place in nature and computing inspired by nature. In this talk I will give an overview of my research in the theoretical aspects of Molecular Computing. In particular I will talk about two theoretical models of computation: splicing systems and membrane systems.

 

Statistics Seminar
Limited and full information estimation and goodness-of-fit testing in 2^n contingency tables: A unified framework
Albert Maydeu-Olivares (Faculty of Psychology, University of Barcelona)
12 Nov 2004Exeter University Laver B74 Friday 4pmStatistics & Operational Research
High-dimensional contingency tables tend to be sparse and standard goodness-of-fit statistics such as chi-square cannot be used without pooling categories. As an improvement on arbitrary pooling, for goodness-of-fit of large 2^n contingency tables, we propose classes of quadratic form statistics based on the residuals of margins or multivariate moments up to order r. These classes of test statistics are asymptotically chi-square. Further, the marginal residuals are useful for diagnosing lack of fit of parametric models. We show that when r is small (r = 2,3) the proposed statistics have better small sample properties and are asymptotically more powerful than chi-square for some useful multivariate binary models. Related to these test statistics is a class of limited information estimators based on low-dimensional margins. We show that these estimators have high efficiency for one commonly used latent trait model for binary data (the two-parameter logistic IRT model).
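
A statistic of this family has the general quadratic-form shape sketched below, where the residual vector, weight matrix and degrees of freedom are written in assumed notation for illustration rather than quoted from the talk.

```latex
% Hedged sketch of an order-r quadratic-form statistic (notation assumed,
% not quoted from the talk): \hat{e}_r collects the residuals of all margins
% up to order r, \hat{C}_r is an estimated weight matrix, N is the sample
% size, d_r the number of margins used and q the number of free parameters.
M_r \;=\; N\,\hat{e}_r^{\top}\,\hat{C}_r\,\hat{e}_r,
\qquad
M_r \;\xrightarrow{\;d\;}\; \chi^2_{d_r - q}.
```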

 

Fractals and Image Processing
Professor Revathy
11 Nov 2004Harrison 209 Tuesday 3pmComputer Science

 

A Painless Introduction to Mereogeometry
Dr Ian Pratt-Hartmann (University of Manchester)
4 Nov 2004Harrison 209 Tuesday 3pmComputer Science
One of the many achievements of coordinate geometry has been to provide a conceptually elegant and unifying account of spatial entities. According to this account, the primitive constituents of space are points, and all other spatial entities---lines, curves, surfaces and bodies---are nothing other than the sets of those points which lie on them. The success of this reduction is so great that the identification of all spatial objects with sets of points has come to seem almost axiomatic. For most of the previous century, however, a small but tenacious band of authors has suggested that more parsimonious and conceptually satisfying representations of space are obtained if we adopt an ontology in which regions, not points, are the primitive spatial entities. These, and other, considerations have prompted the development of formal languages whose variables range over certain subsets (not points) of specified classes of geometrical structures. We call the study of such languages `mereogeometry'. In the past decade, the Computer Science community in particular has produced a steady flow of new technical results in mereogeometry, especially concerning the computational complexity of region-based topological formalisms with limited expressive power. The purpose of this talk is to survey this work in general and (largely) non-technical terms. In particular, we aim to locate the various recent mathematical results in mereogeometry within a general mathematical framework. The result will be an inventory of stock and a list of open problems.

 

TBA
Professor Peter Cowling (Bradford University)
10 Jun 2004Harrison 209 Tuesday 3pmComputer Science

 

Machine Consciousness
Dr Owen Holland (University of Exeter)
3 Jun 2004Harrison 209 Tuesday 3pmComputer Science

 

The work at the Hadley Centre for Climate Prediction and Research, Met Office
Dr Vicky Pope (The Met Office)
26 May 2004Harrison 209 Tuesday 3pmComputer Science

 

Agents and affect: why embodied agents need affective systems.
Professor Ruth Aylett (Salford University)
13 May 2004Harrison 209 Tuesday 3pmComputer Science

 

Logic-based visual perception for a humanoid robot
Dr Murray Shanahan (Imperial College)
6 May 2004Harrison 209 Tuesday 3pmComputer Science

 

RSS Local Group Meeting:
Measurement error, power and sample size in gene-environment interaction studies
Jian'an Luan (University of Cambridge)
11 Jun 2003Laver None NoneStatistics & Operational Research

 

RSS Local Group Meeting:
Machine Learning Techniques for Bioinformatics
Colin Campbell (University of Bristol)
29 May 2003Laver None NoneStatistics & Operational Research

 

Operational Research Seminar:
A cash flow criterion for controlling a base stock inventory system
Roger Hill (University of Exeter)
22 May 2003Laver None NoneStatistics & Operational Research
It is common practice to control certain inventory systems using a 'base stock' policy and the standard approach is then to associate time-weighted costs with holding stock and with having unsatisfied demand and to determine the base stock which minimises the long run average total cost per unit time. This approach ignores the impact of the control policy on the timing of the cash flows associated with payments made to suppliers and revenues received from customers. The approach made here is to concentrate on cash flows and determine the control policy which optimises an appropriate cash flow measure. The impact of the control system is measured in two ways. Firstly, if a customer order is met immediately from stock then we determine for how long that unit has been held in stock and compute the corresponding compound loss of interest resulting from having paid the supplier for it early. Secondly, if a customer order is not met from stock then we determine the delay in meeting that demand and the consequent loss of interest which could have been earned on the customer payment if it had not been delayed. The objective is to minimise the total expected cash flow impact per unit time. A solution procedure is given and a comparison is made between this approach and other ways of controlling this system.
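
The cash-flow measure for a single unit of demand can be sketched as follows, treating the "holding" side as the compound interest foregone on an early supplier payment and the "shortage" side as the interest foregone on a delayed customer payment. The interest rate, time units and cost structure here are illustrative assumptions rather than the model analysed in the talk.

```python
def unit_cashflow_impact(cost, price, rate, held_time=0.0, delay=0.0):
    """Cash-flow impact, in interest terms, of one unit of demand.

    If the unit was held in stock for 'held_time' periods before being sold,
    the supplier was paid early and the compound interest foregone on that
    payment is lost.  If instead the demand waited 'delay' periods before
    being met, the interest the customer's payment could have earned over
    the delay is lost.  'rate' is a per-period compound interest rate; all
    values are illustrative placeholders.
    """
    holding_loss = cost * ((1 + rate) ** held_time - 1)   # paid supplier early
    shortage_loss = price * ((1 + rate) ** delay - 1)     # customer revenue delayed
    return holding_loss + shortage_loss

# Example: a unit held 3 periods versus a demand delayed 1 period, at 0.5% per period
print(unit_cashflow_impact(cost=80.0, price=100.0, rate=0.005, held_time=3))
print(unit_cashflow_impact(cost=80.0, price=100.0, rate=0.005, delay=1))
```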

 

Statistics Seminar:
Bootstrapping SIR for Dimension Assessment in a General Regression Problem
Santiago Velilla (Universidad Carlos III de Madrid)
8 May 2003Laver None NoneStatistics & Operational Research
A bootstrap method is constructed for assessing the dimension of a general regression problem. A resampled version of the matrix used in the SIR method of Li (1991) is obtained, and the bootstrap distributions of the statistics of interest characterized. The proposed methodology incorporates both formal and graphical inference procedures and can be considered as an alternative to the permutation test of Cook and Yin (2001).
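
As a rough illustration of the ingredients involved, the following Python sketch computes the SIR kernel matrix of Li (1991) and bootstraps its ordered eigenvalues; the specific statistics and inference procedures of the talk may differ.

    import numpy as np

    def sir_matrix(X, y, n_slices=10):
        # Kernel matrix of Li's (1991) sliced inverse regression, computed on
        # whitened predictors.  Illustrative sketch only.
        n, p = X.shape
        Xc = X - X.mean(axis=0)
        vals, vecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
        Z = Xc @ (vecs @ np.diag(vals ** -0.5) @ vecs.T)   # whitened predictors
        slices = np.array_split(np.argsort(y), n_slices)   # slice on the response
        M = np.zeros((p, p))
        for idx in slices:
            m = Z[idx].mean(axis=0)
            M += (len(idx) / n) * np.outer(m, m)           # weighted slice means
        return M

    def bootstrap_sir_eigenvalues(X, y, n_boot=200, n_slices=10, seed=0):
        # Bootstrap distribution of the ordered SIR eigenvalues.
        rng = np.random.default_rng(seed)
        n = len(y)
        out = []
        for _ in range(n_boot):
            idx = rng.integers(0, n, n)                    # resample (x_i, y_i) pairs
            lam = np.linalg.eigvalsh(sir_matrix(X[idx], y[idx], n_slices))
            out.append(np.sort(lam)[::-1])
        return np.array(out)                               # n_boot x p array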

 

Statistics Seminar:
Nonparametric classification exploiting separation of populations
Adolfo Hernandez (University of Exeter)
13 Mar 2003Laver None NoneStatistics & Operational Research
Kernel discriminant analysis is greatly affected by the well-known phenomenon often referred to as the 'curse of dimensionality'. This causes the rules to behave poorly, because the amount of data required grows rapidly as the dimension of the problem increases. In this seminar two dimension reduction methods are proposed, based on the concept of separation of populations. The basic idea is firstly to obtain a dimension reduction subspace through the maximization of certain functionals which can be seen as indexes of separation, and, secondly, to evaluate a reduced kernel discriminant rule in that subspace. The good behaviour of these methods is justified both theoretically and also through application to data sets where comparisons with other methods proposed in the literature can be established.

 

RSS Local Group Meeting:
Markov Chain Monte Carlo exact inference for binomial and multinomial logistic regression models
John MacDonald (University of Southampton)
27 Feb 2003Laver None NoneStatistics & Operational Research

 

Statistics Seminar:
Spatial modelling of childhood malaria in the Gambia
Rana Moyeed (University of Plymouth)
20 Feb 2003Laver None NoneStatistics & Operational Research
A spatial generalized linear mixed model is developed to describe the variation in malarial prevalence amongst a sample of village resident children in the Gambia. The response from each child is a binary indicator of the presence of malarial parasites in a blood-sample. The model includes terms for the effects of child-level covariates, village-level covariates and separate components for residual spatial and non-spatial extra-binomial variation. The results show that the extra-binomial variation is spatially structured, suggesting an environmental effect rather than variation in familial susceptibility. The method of inference was Bayesian using vague priors and a Markov chain Monte Carlo implementation.

 

RSS Local Group Meeting:
The Forward Search and the Analysis of Multivariate Data
Anthony Atkinson (London School of Economics)
6 Feb 2003Laver None NoneStatistics & Operational Research

 

Operational Research Seminar:
An Overview of OR in Sport
Chris Potts (University of Southampton)
16 Jan 2003Laver None NoneStatistics & Operational Research
This talk reviews some contributions that operational research (OR) has made in sport. The contributions roughly fall within the areas of planning and strategy, scheduling, ranking and performance measurement, and prediction. In planning and strategy, we discuss the use of dynamic programming for optimising batting strategies in one-day cricket, and describe how pit stop strategies are determined in formula one motor racing. For scheduling sports fixtures, we indicate how OR has been used in county cricket. Different sports use different methods for measuring the performance of players/teams. We comment briefly on the need for more robust systems. Finally, prediction is important for bookmakers and is of interest to sports enthusiasts. The modelling of the results of soccer games is discussed.

 

Postgraduate Seminar:
Clustering Gene Expression Data
Heather Turner (University of Exeter)
12 Dec 2002Laver None NoneStatistics & Operational Research
Gene expression data, resulting from the relatively new microarray technology, has many features which make it difficult to analyse. Many traditional techniques fail due to the size of the data sets or the lack of conformity to common assumptions, such as normality or independence. Clustering has proven to be a useful method of discovering functional groupings of genes, one of the main objectives of microarray experiments. A number of clustering algorithms have been specifically developed for gene expression data, designed to be more efficient and more flexible than standard algorithms. One of these algorithms is the plaid model, a two-way overlapping clustering method proposed by Lazzeroni and Owen (2002). This talk introduces the plaid model and proposes an alternative optimisation algorithm that may improve its efficiency. Possible extensions of the model will also be discussed.

 

RSS Local Group Meeting:
Predicting Reliability for Orthopaedic Hip Replacements
Simon Wilson (Trinity College Dublin & Carlos III University Madrid)
4 Dec 2002Laver None NoneStatistics & Operational Research

 

RSS Local Group Meeting:
Challenges in Bioinformatics for Statisticians
Wally Gilks (MRC Biostatistics Unit, Cambridge)
21 Nov 2002Laver None NoneStatistics & Operational Research

 

Statistics Seminar:
Denoising real data using complex wavelets
Stuart Barber (University of Bristol)
7 Nov 2002Laver None NoneStatistics & Operational Research
Wavelet shrinkage is an effective nonparametric regression technique when the underlying curve has irregular features such as spikes or discontinuities. The basic idea is simple: take the discrete wavelet transform (DWT) of data consisting of a signal corrupted by noise; shrink the wavelet coefficients to remove the noise; and then invert the DWT to form an estimate of the true underlying curve. Various authors have proposed methods of doing this using real-valued wavelets. Complex-valued versions of some wavelets exist, but are rarely used. We propose two shrinkage techniques which use complex wavelets. Simulation results show that both methods give smaller errors than using state of the art shrinkage rules with real-valued wavelets.
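
The basic recipe (transform, shrink, invert) can be sketched in a few lines of Python with PyWavelets, here using a real-valued wavelet, soft thresholding and the universal threshold; the complex-valued shrinkage rules proposed in the talk are not reproduced.

    import numpy as np
    import pywt  # PyWavelets

    def wavelet_denoise(y, wavelet="db4"):
        # The generic DWT -> shrink -> inverse DWT recipe described above.
        coeffs = pywt.wavedec(y, wavelet)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745     # noise level estimate
        thresh = sigma * np.sqrt(2.0 * np.log(len(y)))     # universal threshold
        shrunk = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft")
                                for c in coeffs[1:]]
        return pywt.waverec(shrunk, wavelet)[:len(y)]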

 

Statistics Seminar:
Bayesian rule based classification
Chris Holmes (Imperial College London)
30 Oct 2002Laver None NoneStatistics & Operational Research
We describe a new method for statistical pattern recognition that is based on probabilistic rule sets. The model constructs a set of first-order rules of the form IF A THEN B, where the antecedent A relates to conditions on a set of predictor measurements x and the consequent B relates to changes in the odds function of the conditional probability p(y|x) for a category label y. Rule-set models are highly expressive and interpretable, and a key feature of the method is the ease with which expert domain knowledge can be incorporated into the classifier system. A Bayesian framework is used which places a prior distribution over the state space of all probabilistic rule sets. Inference proceeds using stochastic simulation via tailored Markov chain Monte Carlo algorithms. The methodology is illustrated using examples taken from the machine learning literature where typically we have tens or hundreds of predictors and hundreds or thousands of observations.
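
As a purely illustrative sketch of how such a rule set might score a new observation (our reading of "changes in the odds function"; the prior over rule sets and the MCMC fitting are not shown):

    import numpy as np

    def rule_set_probability(x, intercept, rules):
        # Each satisfied antecedent adds its delta to the log-odds of the
        # positive class; only the scoring step is shown.
        log_odds = intercept + sum(delta for antecedent, delta in rules
                                   if antecedent(x))
        return 1.0 / (1.0 + np.exp(-log_odds))

    # hypothetical rule set on a two-dimensional predictor
    rules = [(lambda x: x[0] > 2.0,  1.5),   # IF x1 > 2  THEN raise the odds
             (lambda x: x[1] < 0.0, -0.8)]   # IF x2 < 0  THEN lower the odds
    print(rule_set_probability(np.array([3.0, 0.5]), intercept=-0.2, rules=rules))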

 

Operational Research Seminar:
Inventory - too much, too little, or right on?
Geoff Relph (Manchester Business School)
24 Oct 2002Laver None NoneStatistics & Operational Research
Inventory has a major impact on business performance. How do you achieve that elusive balance - customer satisfaction and lower inventory? This talk examines some of the issues involved in better inventory planning. The concept of 'overage inventory' is defined and developed. Case research on inventory management in a manufacturing company is discussed. Options for evaluating and estimating the value of overage and determining simple prioritisation techniques for the desired corrective action needed to reduce the overage are examined. The pragmatic balance between purist academic views of inventory management and the instinctive approach often used in a small business are considered.

 

RSS Local Group Meeting:
Does the weather God play dice?
David Stephenson (University of Reading)
23 May 2002Laver None NoneStatistics & Operational Research

 

RSS Local Group Meeting & AGM:
Something in the air? Multivariate analysis and atmospheric science
Ian Jolliffe (University of Aberdeen)
16 May 2002Laver None NoneStatistics & Operational Research

 

Operational Research Seminar:
Base stock inventory policies
Mundappa Pakkala (University of Mangalore)
2 May 2002Laver None NoneStatistics & Operational Research
We consider an inventory model in continuous time. Demand follows a Poisson process and demand during a stockout is backordered. The stock level is controlled by means of a base stock policy in which the balance of physical stock plus stock on order less backorders is maintained at the base stock level. Therefore if a demand occurs then an order for replacement stock is placed immediately. The time for a replacement order to arrive is the lead time - this may be fixed or it may vary. We consider here two variants on this basic policy. First, multi-item demand processes. Second, modelling the cash flows. In each case we discuss the context of the problem, its mathematical formulation and a procedure for finding the optimal solution.
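
For the basic single-item case, standard results give the steady-state performance of a base stock policy directly; the Python sketch below uses the fact that the number of units on order is Poisson with mean equal to the demand rate times the lead time. The multi-item and cash-flow variants discussed in the talk are not covered.

    import numpy as np
    from scipy.stats import poisson

    def base_stock_performance(S, lam, L):
        # Textbook-style steady-state measures for a base stock level S with
        # Poisson demand of rate lam and a fixed lead time L.
        mu = lam * L                                       # mean units on order
        k = np.arange(0, S + int(mu + 10.0 * np.sqrt(mu)) + 1)
        pmf = poisson.pmf(k, mu)
        on_hand = np.sum(np.maximum(S - k, 0) * pmf)       # expected stock on hand
        backorders = np.sum(np.maximum(k - S, 0) * pmf)    # expected backorders
        fill_rate = poisson.cdf(S - 1, mu)                 # P(demand met from stock)
        return on_hand, backorders, fill_rate

    print(base_stock_performance(S=8, lam=3.0, L=2.0))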

 

Operational Research Seminar:
Locating ambulances in Riyadh: theoretical developments and practical application
Graham Rand (University of Lancaster)
14 Mar 2002Laver None NoneStatistics & Operational Research
The location of Emergency Medical Services (EMS) is an important problem. Good locations, enabling rapid response, can save lives. Typical OR modelling for these problems tries to improve coverage, which is defined as the ability to travel from a service station to a demand point in a pre-specified time. A model was developed to evaluate locations for the Saudi Arabian Red Crescent Society (SARCS), Riyadh City, Saudi Arabia. In this model, first, the usual 0-1 coverage definition (i.e. the demand is covered or not) is replaced by the probability of covering a demand within the target time. Second, once the locations are determined, the minimum number of vehicles at each location that satisfies the required performance levels is determined. Thus, the problem of identifying the optimal locations of a pre-specified number of emergency medical service (EMS) stations is addressed by goal programming. The first goal is to locate these stations so that the maximum expected demand can be reached within a pre-specified target time. The second goal is then to ensure that any demand arising within the service area of a station will find at least one vehicle, such as an ambulance, available. Erlang's loss formula is used to identify the arrival rates at which it is necessary to add an ambulance in order to maintain the performance level for the availability of ambulances. The use of the model for the Riyadh EMS will be described. This work was undertaken jointly with Othman Alsalloum.
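
Erlang's loss formula itself is easy to compute with the standard recursion; the Python sketch below (with an illustrative availability threshold) shows the kind of check used to decide when an extra ambulance is needed.

    def erlang_b(servers, offered_load):
        # Erlang's loss formula B(s, a): probability that all s ambulances at a
        # station are busy when a call arrives, for offered load
        # a = call rate x mean job time.  Standard stable recursion.
        b = 1.0
        for k in range(1, servers + 1):
            b = offered_load * b / (k + offered_load * b)
        return b

    def vehicles_needed(offered_load, max_loss=0.05):
        # Smallest fleet keeping the loss probability below max_loss
        # (the 5% threshold here is illustrative, not from the talk).
        s = 1
        while erlang_b(s, offered_load) > max_loss:
            s += 1
        return s

    print(vehicles_needed(offered_load=2.5))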

 

RSS Local Group Meeting:
Inference in fMRI experiments using spectral domain methods
Jonathan Marchini (University of Oxford)
28 Feb 2002Laver None NoneStatistics & Operational Research

 

Postgraduate Seminar:
Lot sizing policies in an advance ordering environment
Lynette Frick (University of Exeter)
21 Feb 2002Laver None NoneStatistics & Operational Research
In most classic inventory models customer demand is assumed to be either deterministic or stochastic. In some multi-period applications, future demand is only

 

Statistics Seminar:
Disease mapping of stage-specific cancer incidence data
Leo Knorr-Held (University of Lancaster)
14 Feb 2002Laver None NoneStatistics & Operational Research
We propose two approaches for the spatial analysis of cancer incidence data with additional information on the stage of the disease at time of diagnosis. The two formulations are extensions of commonly used models for multicategorical response data on an ordinal scale. We include spatial and age group effects in both formulations, which we estimate in a nonparametric smooth way. More specifically, we adopt a fully Bayesian approach based on Gaussian pairwise difference priors where additional smoothing parameters are treated as unknown as well. We argue that the proposed methods are useful in monitoring the effectiveness of mass cancer screening and illustrate this through an application to data on cervical cancer in the former German Democratic Republic. The results suggest that there are large spatial differences in the stage-proportions, which indicates spatial variability with respect to the introduction and effectiveness of pap smear screening programs. This is joint work with G Rasser, University of Munich and N Becker, German Cancer Research Center Heidelberg.

 

Statistics Seminar:
Cross-validation in additive main effect and multiplicative interaction (AMMI) models
Carlos Tadeu dos Santos Dias (University of Sao Paulo/ESALQ)
7 Feb 2002Laver None NoneStatistics & Operational Research
The additive main effects and multiplicative interaction (AMMI) model has been proposed for the analysis of genotype-by-environment data. For plant breeding, the recovery of pattern might be considered to be the principal objective of analysis. However, some problems still remain with the analysis, notably in selecting the number of multiplicative components in the model. Methods based on distributional assumptions do not have a sound methodological basis, while existing data-based approaches do not optimise the cross-validation process. This talk will first summarise the AMMI model and outline the available methodology for selecting the number of multiplicative components to include in it. Then two new methods will be described that are based on a full leave-one-out procedure optimising the cross-validation process. Both methods will be illustrated and compared on some unstructured multivariate data. Finally, their application to the analysis of GxE interaction will be demonstrated on experimental grain yield data.
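
For reference, the structure of the AMMI model - additive main effects plus a truncated SVD of the interaction residuals - can be sketched in Python as follows; the cross-validation procedures compared in the talk are not implemented here.

    import numpy as np

    def ammi_fit(Y, n_components):
        # Fit an AMMI model to a genotype x environment table Y of means.
        grand = Y.mean()
        g_eff = Y.mean(axis=1) - grand                    # genotype main effects
        e_eff = Y.mean(axis=0) - grand                    # environment main effects
        additive = grand + g_eff[:, None] + e_eff[None, :]
        # leading multiplicative interaction terms from the SVD of the residuals
        U, s, Vt = np.linalg.svd(Y - additive, full_matrices=False)
        gxe = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components]
        return additive + gxe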

 

RSS Local Group Meeting:
Anticipating catastrophes through extreme value modelling
Stuart Coles (University of Bristol)
24 Jan 2002Laver None NoneStatistics & Operational Research

 

Statistics Seminar:
Geostatistical models and applications
Paulo Ribeiro (University of Lancaster)
17 Jan 2002Laver None NoneStatistics & Operational Research
The term 'geostatistics' identifies the part of spatial statistics which is concerned with continuous spatial variation. The term 'model-based geostatistics' was coined by Diggle, Tawn and Moyeed (1998) to mean the application of explicit parametric, stochastic models and formal, likelihood-based methods of inference to geostatistical problems. Geostatistical methods are currently applied in a wide range of subjects, and model-based methods provide further options for tackling challenging practical problems. Motivated by some practical applications, this talk discusses model-based geostatistical methods and their computational implementation.

 

Operational Research Seminar:
A bounding problem in inventory modelling
Roger Hill (University of Exeter)
10 Jan 2002Laver None NoneStatistics & Operational Research
A fundamental inventory model is the stochastic demand, periodic or continuous review, backorder model with linear holding, shortage and ordering costs and a general lead time on replenishment. It is well-established that the optimal control policy for this model is an (s,S) policy and efficient procedures exist for deriving this optimal policy. An important feature of most practical systems is that packaging and handling considerations require that replenishments must be in multiples of some unit of stock transfer q. This talk describes, in outline, the fundamental model and shows how the analysis can be adapted to allow for a general unit of stock transfer q. It finally raises some, as yet unresolved, issues on developing procedures for finding the optimal policy for this modified model.
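
As one plausible illustration (not the talk's optimal policy) of how a unit of stock transfer q constrains ordering within an (s,S) framework, consider the following rule:

    import math

    def replenishment_quantity(inventory_position, s, S, q):
        # When the inventory position falls to s or below, order the smallest
        # multiple of q that restores the position to at least S.  This only
        # illustrates the constraint; finding the optimal modified policy is
        # exactly the open issue raised in the talk.
        if inventory_position > s:
            return 0
        return q * math.ceil((S - inventory_position) / q)

    print(replenishment_quantity(inventory_position=3, s=5, S=20, q=6))  # -> 18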

 

Analysis of transplant survival rates
Dave Collett (University of Reading)
28 Nov 2001Laver None NoneStatistics & Operational Research

 

Should small firms be more cautious than large ones? - Dynamic programming models of operations management decisions in small firms.
Lyn Thomas (University of Southampton)
22 Nov 2001Laver None NoneStatistics & Operational Research
Operations management models, such as those for inventory control and production levels, have proved very successful in the operations of firms. However, they all take as their objective the maximisation of profit or the minimisation of cost. For small firms it could be argued that maximising the probability of survival of the firm is the principal objective. This talk looks at how one can model operations management decisions under this criterion using dynamic programming, and compares the survival-probability-maximising decisions with the profit-maximising ones. It suggests that small firms should be more cautious (but not too cautious) than large firms.
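
A toy dynamic program in this spirit, with entirely hypothetical numbers and transition structure, might look as follows in Python:

    from functools import lru_cache

    # Each period the firm picks a production level, demand is random, and the
    # firm fails if its cash position ever goes negative.
    P_DEMAND = {0: 0.3, 1: 0.5, 2: 0.2}          # demand distribution per period
    PRICE, UNIT_COST, FIXED_COST = 5, 2, 1

    @lru_cache(maxsize=None)
    def survival_prob(cash, periods_left):
        # Maximum probability of surviving the remaining periods from this state.
        if cash < 0:
            return 0.0
        if periods_left == 0:
            return 1.0
        best = 0.0
        for produce in range(3):                  # candidate production levels
            prob = sum(p * survival_prob(cash
                                         + PRICE * min(produce, demand)
                                         - UNIT_COST * produce - FIXED_COST,
                                         periods_left - 1)
                       for demand, p in P_DEMAND.items())
            best = max(best, prob)
        return best

    print(survival_prob(cash=3, periods_left=5))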

 

Postgraduate Seminar: Statistical modelling of performance indicators
Paul Hewson (University of Exeter)
15 Nov 2001Laver None NoneStatistics & Operational Research
Performance Indicators are amongst the most widely published official statistics in the UK. It has been suggested that the UK central government has set over 5,000 targets against these statistics. In contrast to the wealth of numerical data available, less effort has been applied to the statistical analysis of the data. Most work performed to date has been in the educational and health fields, although considerable money has been spent evaluating data envelopment analysis for the Home Office to develop targets for police performance. Using two sets of Performance Indicators, relating to Housing Benefit Administration and Road Safety (reflecting output and outcome indicators), various methods for analysis will be reviewed, particularly approaches based upon generalised linear mixed models. Work in progress to account for the multivariate nature of the data in such models will also be described.

 

RSS Meeting: Modelling spatial-temporal processes for hydrology and climate
Valerie Isham (University College London)
7 Nov 2001Laver None NoneStatistics & Operational Research
A review will be given of some of the spatial-temporal models developed by an interdisciplinary team from University College and Imperial College for use in the context of hydrological design. Approaches using both point-process-based stochastic models and statistical, generalised linear, models (GLMs) will be described. A strength of the former models is their ability to represent high space-time resolution, while the latter more easily enable spatial and temporal nonstationarities to be incorporated. These models are also being used to investigate other climatological processes, such as temperature and wind speed, where there is a particular focus on questions of the influence of long-range effects (e.g., El Nino), and climate change.

 

RSS Meeting: Bayes 'n' drugs 'n' sporting role
Phil Brown (University of Kent)
25 Oct 2001Laver None NoneStatistics & Operational Research
A joint EU/IOC international project centered on St. Thomas's Hospital London has been looking at detection of growth hormone abuse in sport. We describe approaches to modelling multivariate markers of GH intake through time to discriminate between those that were treated with growth hormone and those on placebo in a double-blind study.

 

Estimating abundance from data containing many zeros
Alan Welsh (University of Southampton)
18 Oct 2001Laver None NoneStatistics & Operational Research
North East Herald Cay is a small but ecologically significant coral cay in the Coral Sea, about 350 km off the coast of Queensland, Australia. As part of the development of a monitoring program, we consider the problem of estimating the number of nests of different species of seabirds on North East Herald Cay, based on surveys of 10 m x 10 m quadrats along transects across the Cay. We consider three approaches based on different plausible models. Our main findings are that an approach based on a conditional negative binomial model which allows for additional zeros in the data works well, and that a transform-both-sides regression approach produces badly biased estimates and should not be used. We discuss our experience of collecting the data and applying the methodology to the available data, and consider the implications for monitoring nesting on North East Herald Cay.
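
As a sketch of the "additional zeros" idea (without covariates, and not the speaker's conditional model), a simple zero-inflated negative binomial likelihood can be written and maximised as follows:

    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import nbinom

    def zinb_negloglik(params, y):
        # With probability pi a quadrat is a structural zero; otherwise its
        # count is negative binomial NB(r, p).
        logit_pi, log_r, logit_p = params
        pi = 1.0 / (1.0 + np.exp(-logit_pi))
        r = np.exp(log_r)
        p = 1.0 / (1.0 + np.exp(-logit_p))
        nb = nbinom.pmf(y, r, p)
        lik = np.where(y == 0, pi + (1.0 - pi) * nb, (1.0 - pi) * nb)
        return -np.sum(np.log(lik + 1e-300))

    # toy counts with an excess of zeros (hypothetical, not the seabird data)
    y = np.concatenate([np.zeros(60, dtype=int),
                        nbinom.rvs(2, 0.4, size=40, random_state=0)])
    fit = minimize(zinb_negloglik, x0=np.zeros(3), args=(y,), method="Nelder-Mead")
    print(fit.x)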

 

Statistics in sport and games
Frank Duckworth (Editor, RSS News)
10 Oct 2001Laver None NoneStatistics & Operational Research

 

Special RSS Meeting and AGM at Plymouth University (Robbins Seminar Room 2): A Heretic's View of Placebos and Ethics in Clinical Trials
Stephen Senn (University College, London)
31 May 2001Laver None NoneStatistics & Operational Research

 

Postgraduate Seminar: Using State Space models to investigate the effect of vitamin A supplement on diarrhoea
Valeska Andreozzi (University of Exeter)
24 May 2001Laver None NoneStatistics & Operational Research

 

Postgraduate Seminar: A Score Test for Zero-inflated Negative Binomial Models
Naratip Jansakul (University of Exeter)
24 May 2001Laver None NoneStatistics & Operational Research

 

The size of orders from customers, characterisation, forecasting and implications
Roy Johnston (Warwick University)
18 Apr 2001Laver None NoneStatistics & Operational Research

 

Using the Randomisation in Specifying the Mixed Models and ANOVA tables
Chris Brien (University of South Australia)
16 Mar 2001Laver None NoneStatistics & Operational Research

 

Postgraduate Seminar: Analysis of Multivariate Process Control Data
Julie Badcock (University of Exeter)
8 Mar 2001Laver None NoneStatistics & Operational Research

 

Postgraduate Seminar: The Evolution of Trees: Application of Genetic Algorithms to Network Optimisation
Evan Thompson (University of Exeter)
8 Mar 2001Laver None NoneStatistics & Operational Research

 

Use of discrete event simulation in the evaluation of screening for Helicobacter pylori for the prevention of peptic ulcers and gastric cancer.
Ruth Davis (University of Southampton)
1 Mar 2001Laver None NoneStatistics & Operational Research

 

RSS meeting: Estimating Mixtures of Regressions
Merrilee Hurn (University of Bath)
15 Feb 2001Laver None NoneStatistics & Operational Research

 

RSS meeting: Independent component analysis: flexible sources and non-stationary mixing
Richard Everson (University of Exeter (Computer Science))
25 Jan 2001Laver None NoneStatistics & Operational Research

 
