Crossrail is creating deep cavernous spaces underground right in the heart of central London. Here we look at how the project has coped with processing the massive amount of complex data produced to monitor the ground and surrounding buildings.
In a busy urban environment where complex and deep excavations are made just millimetres away from multi-storey or historical buildings, movement caused by the construction work is a major risk for any project. Now imagine that the project is Crossrail, where cavernous spaces are being created underground at hugely constrained sites across the heart of central London.
Over the last few years, better technology and increased nervousness over the consequences of damaging some of the most expensive real estate in the world, or disrupting the lives of those living and working in it, have made monitoring ground and adjacent building movements big business, with costs to projects increasing as a result. Automated instrumentation and growing numbers of instruments have resulted in a substantial increase in the quantity of data that requires interpretation.
Across the central London Crossrail sites, a large amount of money was being spent on monitoring. At each station, many teams of surveyors were employed and the cost of on-going manual monitoring was high. Frustrated with the situation, client Crossrail turned to its engineers from Arup and Atkins to come up with a solution.
“From a project perspective, it was also imperative to avoid something untoward occurring because of failure to interpret the monitoring data,” says Arup associate director Mike Devriendt. “So we considered what could be developed to interpret the right data quickly to allow the decision making process to be more appropriate on site.
“We also considered whether there was a technical solution that could suggest where there was significant redundancy of monitoring data being acquired and where on-going monitoring costs could be reviewed.”
In the past, asset protection engineers on site would have to attend regular daily and weekly meetings, and much of their time would be taken up reviewing large paper-based documents of graphs summarising the current monitoring results. The data would then have to be matched to records of the construction progress which had taken place that day to try to piece together what had caused specific movements. With the instruments on each of the Crossrail sites producing around a million data points each day, this was by no means an easy task.
“Manual correlation of the data to construction progress was challenging and time consuming,” says Devriendt.
The solution lay in teaming up with a data science company called Quantum Black. The company, which started out in Formula 1 helping to inform race strategies, specialised in taking large quantities of data, interpreting it and providing results in a matter of seconds.
Together, a team of Arup, Atkins and Quantum Black developed a web-based application known as Adaptive Instrumentation and Monitoring (AIM). The program took the data from each site and, using complex statistical analysis, produced visualisations that sought to identify spatial and time-dependent correlations between monitoring points.
“Quantum Black had no geotechnical or civil engineering background, so they had to team up with individuals in Arup and Atkins who would guide them and tell them what was useful,” explains Devriendt. “They have a strong understanding of statistics and statistical methods for which they can obtain correlations in data sets, but it relied upon engineering knowledge to subsequently interpret the significance of the outcomes.”
One of the ways in which the software helped the engineers on site was by combining construction progress records with the movement data being produced.
“Existing monitoring software often show graphs about how movement is happening with time but they do not convey an understanding of what construction works are influencing the monitoring points, be that tunnel construction, piles being installed or excavation,” says Devriendt. “By collecting the construction progress data we are able to clearly identify in the AIM application zones of influence around a given construction activity.”
Engineers on site were tasked with completing daily spreadsheets detailing the activities being carried out on site that day. This enabled the team to link the movements recorded to the events causing them. Although a seemingly simple step, it saved time and transformed the way in which the engineers were able to use the data.
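The article does not show how AIM joins the activity spreadsheets to the movement data, but the step it describes is essentially a date-keyed join. A minimal sketch, with entirely hypothetical activity names and readings:

```python
from datetime import date

# Hypothetical daily activity log keyed by date (illustrative entries,
# not Crossrail's actual schema).
activities = {
    date(2015, 3, 2): "excavation to -12m, east headwall",
    date(2015, 3, 3): "sprayed concrete lining, ring 41",
}

# Settlement readings (mm) from one monitoring point, keyed by date.
readings = {
    date(2015, 3, 2): -1.8,
    date(2015, 3, 3): -2.6,
}

# Link each recorded movement to the activity logged for the same day.
linked = [
    (d, readings[d], activities.get(d, "no activity logged"))
    for d in sorted(readings)
]

for d, mm, act in linked:
    print(f"{d}: {mm:+.1f} mm during {act}")
```

Once the two streams share a key, a movement spike can be read off against the construction event that coincided with it, which is the correlation the engineers previously assembled by hand.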
The software now had three key benefits, the first of which was the ability to detect anomalies in the data being produced.
Using the “big data” analysis capabilities provided by Quantum Black, it was possible to link geographically close points together to produce contoured graphs of the movements. By doing so it was then possible to quickly identify rogue points which appeared to step out of line as either a genuine concern or as a result of something simply going wrong with the instrument.
“Imagine if you have three settlement studs in the street, anomalies in the data set could arise from re-baselining or from other extraneous influences resulting in the point being disturbed or destroyed,” explains Devriendt. “The analytics were continually trying to identify the spatial and time-dependent correlation between the monitoring points such that if you saw one of the points heading off southwards on the graph, then it would flag up to the user that something was amiss.”
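The anomaly check Devriendt describes can be illustrated with a simple robust-statistics sketch: because nearby studs should move together, a reading that departs from the median of its neighbours by more than some tolerance is flagged. The data, the 2 mm tolerance and the median approach are all assumptions for illustration, not AIM's actual analytics.

```python
from statistics import median

# Three settlement studs in the same street: readings over five epochs (mm).
studs = {
    "S1": [-0.5, -1.0, -1.4, -1.8, -2.1],
    "S2": [-0.4, -0.9, -1.5, -1.7, -2.0],
    "S3": [-0.5, -1.1, -1.4, -6.0, -6.2],  # disturbed or re-baselined
}

TOLERANCE_MM = 2.0  # assumed threshold for a point "heading off southwards"

def flag_anomalies(studs, tol):
    """Flag any reading that departs from the epoch median by more than tol."""
    flags = []
    epochs = len(next(iter(studs.values())))
    for t in range(epochs):
        values = {name: series[t] for name, series in studs.items()}
        med = median(values.values())
        for name, v in values.items():
            if abs(v - med) > tol:
                flags.append((t, name))
    return flags

print(flag_anomalies(studs, TOLERANCE_MM))  # S3 flagged at epochs 3 and 4
```

The median makes the comparison robust: the rogue stud is flagged without dragging the reference level towards itself, so the two well-behaved studs stay clear.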
The second benefit was the ability to forecast the settlement produced by a particular activity, giving the engineers earlier warning of potential breaches of trigger values, which could result in costly stoppages of work on site.
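The article does not say how AIM forecasts, but the idea can be sketched with the simplest possible model: fit a straight line to recent readings by least squares and extrapolate to the trigger value. The readings and the trigger level below are assumed for illustration.

```python
# Hypothetical settlement readings (mm) on consecutive days at one point.
readings = [-0.5, -1.1, -1.6, -2.2, -2.7]
TRIGGER_MM = -5.0  # assumed amber trigger value

def days_until_breach(readings, trigger):
    """Least-squares straight-line fit, extrapolated to the trigger level."""
    n = len(readings)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(readings) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, readings))
             / sum((x - x_mean) ** 2 for x in xs))
    if slope >= 0:
        return None  # no downward trend, so no forecast breach
    intercept = y_mean - slope * x_mean
    breach_day = (trigger - intercept) / slope
    return max(0.0, breach_day - (n - 1))  # days beyond the last reading

print(f"Forecast breach in {days_until_breach(readings, TRIGGER_MM):.1f} days")
```

Even this crude extrapolation turns a reactive trigger check into a forward warning; a production system would of course use a model informed by the construction activity driving the movement.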
The third benefit, and perhaps the one which offered the greatest potential to transform the monitoring of sites in the future, was sampling.
Deciding the number of monitoring points required on site can be a very subjective process, with third parties often wanting to increase the amount from the original tender, says Devriendt. The team wanted to use the data from site to investigate whether the number of readings taken could be reduced. Graphs of specific areas were produced showing contours using all available monitoring points. These were then compared to plots of contours derived from only half of the points. Where the difference between the two sets of contours was consistently small, a reduction in the number of points could be considered.
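The comparison described above can be sketched in one dimension: rebuild the settlement profile from every other point by linear interpolation, then measure the error at the points that were dropped. The chainages and readings below are invented for illustration, and real contour comparison would be done in two dimensions.

```python
# Hypothetical settlement readings (mm) at points along a street chainage (m).
chainage = [0, 10, 20, 30, 40, 50, 60, 70, 80]
settlement = [-0.2, -0.8, -1.9, -3.1, -3.6, -3.0, -2.0, -0.9, -0.3]

def interp(xs, ys, x):
    """Piecewise-linear interpolation of y at position x."""
    for i in range(len(xs) - 1):
        if xs[i] <= x <= xs[i + 1]:
            t = (x - xs[i]) / (xs[i + 1] - xs[i])
            return ys[i] + t * (ys[i + 1] - ys[i])
    raise ValueError("x outside surveyed range")

# Rebuild the profile from every other point and check it at the dropped ones.
half_x = chainage[::2]
half_y = settlement[::2]
errors = [abs(interp(half_x, half_y, x) - y)
          for x, y in zip(chainage[1::2], settlement[1::2])]

print(f"Max error from halving the points: {max(errors):.2f} mm")
```

If the maximum error stays consistently small relative to the trigger values, the dropped points are largely redundant, which is the case the team made for reviewing monitoring density.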
“We looked at a lot of different construction activities like sprayed concrete lined tunnel construction, excavations, installing piles or compensation grouting,” says Devriendt. “We’ve used the technique out in Singapore looking at the Metro construction on the Downtown Line 3, a project where the civil works are at a similar stage to Crossrail, and we saw similar correlations.”
Using the analytics as a tool to support their engineering judgement, the team found that they could suggest whether the distribution and amount of monitoring that was being carried out was either over or under what was required.
“What we are seeing from Singapore is that even though the ground conditions are different to London, there are similar levels of redundancy in the monitoring as to what has been observed in London,” says Devriendt.
“Clients like HS2 have taken a keen interest in this technology as there is a general perception that the amount of monitoring that was carried out on Crossrail was significant and for the next job there should be an opportunity to optimise what we do.”
Devriendt is clearly very enthusiastic about the potential of the software which has been developed.
“If we went back five years ago, it would not be possible to process millions of data points on a daily basis and to run the complex analytics through it, there just wasn’t the computing power.”