Hospital Pharmacies Have a Key Role in Macro Health Improvement

Hello everyone – Happy Easter.

Just a quick post to note that our latest case study is live – we have been working to establish Ascribe On-Line Analytics as a cloud platform for non-patient-identifiable data which provides insight into clinical and operational practice in Hospital Pharmacies, in the wider context of patient care, health and well-being.  Gawd, that was a long sentence.  Clearly I have had too much chocolate.

But seriously folks…

Look out for phase two of this exciting project as we embed the clinical intelligence into mobile clinical workflow, for example to optimise clinical audit and provide point-of-care decision support.  Thanks very much to the team from Teesside to Redmond and all points in between for working on this.



Filed under Uncategorized

Hadoop is growing up – what does that mean for us all?

Okay, it’s time to talk about that little yellow elephant again.  Hasn’t he got big since this time last year, when everyone began talking about him?  It’s time to look again at whether Hadoop is hype, I guess, and this morning the new Forrester report wafted into my in-tray.  Nice.  I might have contributed to the piece on Microsoft in it… I am not sure if I can possibly say – except that whoever gave them the feedback they quoted is surely the smartest, most handsome young man in the world of data.  Not you then, clicky, I hear you cry.

So, Hadoop: Get a load of data, break it into chunks, stick it in smaller pots, apply processing on the pots, bring the processed data back together and you have made the element “insight” from the element “data”.  If you find something else to do in the alchemy process, like apply natural language processing or predictive modelling, then you can really transform your business.  Clearly more technical explanations of what Hadoop does are available.
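Since we are being less technical, here is a toy, pure-Python sketch of the map / shuffle / reduce pattern that Hadoop automates at scale (no HDFS, no cluster – just the shape of the alchemy; the word-count job and the data are invented for illustration):

```python
from collections import defaultdict

def map_phase(chunk):
    # Emit (key, value) pairs from one chunk of raw records
    return [(word, 1) for record in chunk for word in record.split()]

def shuffle(mapped):
    # Group values by key, as Hadoop does between the map and reduce steps
    groups = defaultdict(list)
    for key, value in mapped:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Collapse each key's values into a single result
    return {key: sum(values) for key, values in groups.items()}

data = ["aspirin aspirin ibuprofen", "ibuprofen paracetamol"]
chunks = [data[:1], data[1:]]  # "stick it in smaller pots"

mapped = [pair for chunk in chunks for pair in map_phase(chunk)]
print(reduce_phase(shuffle(mapped)))
# {'aspirin': 2, 'ibuprofen': 2, 'paracetamol': 1}
```

Hadoop’s trick is that the map and reduce steps run in parallel on separate machines, each holding only its own pot of data.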

It’s the maturity of the Hadoop market that grabs me.  It isn’t fancy open source any more.  It is powering the data processing strategies of some pretty big platform vendors just now, and they are competing with each other in a market driven by the desire for faster processing speed at lower cost.  The early, open-source-rooted installs of Hadoop have spawned a proliferation of now business-critical solutions, developed, supported and exploited by in-house tech teams and boutiques as the technology democratizes.  The interesting thing, then, is how the platform vendors are using Hadoop, alongside their conventional offerings, to differentiate from each other.

And it all works.  If you look at the Forrester Wave assessment, all the vendors assessed sit in the top right, which means they have scalable, proven solutions underpinned by a solid strategy to get even better.

Forrester recognises the value of Microsoft’s strategy, which links cheap cloud storage, cheap cloud processing using HDInsight (their version of Hadoop) and on-premise analytics provided by SQL and Office.  You can put your SQL in the sky if you want to.  You can put your Office there too, I know.  However, it’s the ability to have all these technologies integrated, but with optional deployment models, that is quite exciting to me.

I was working on a public sector project recently where the customer had a traditional warehouse / BI problem but was constrained by the nature of their business.  Like all UK public sector organisations operating at scale, they find that storing persistent and ever-growing volumes of data is neither tenable nor congruent with their business function or modus operandi.  Councils, for example, will do more commissioning and less direct service provision, which will be outsourced.  They will need intelligence to understand demand and contract to have it met.  They will need to manage contracts.  They will also need a diverse range of outcome measures, including sentiment analysis.  This is a big data world where the old monolithic data warehouse has a role, but not the role.

We came up with a reference architecture for this world, using the Microsoft stack, which squared the circle.  We couldn’t have done that in the pre-Hadoop / pre-in-memory-analytics world – not just because of the technology but also because of the thinking.  The project in question has a mixed economy of quantitative and qualitative data sets, some flowing at pace, some barely changing from year to year, and customers who don’t know what questions they want to ask.  The architecture caters for the storage of “just what we need”, data quality and master data services tied into retention, disposal and rights management strategies, and then dimensional and tabular model powered analytics – with a tiny persisting footprint.

And that’s the key, innit?  Back to my small data point.  The purpose of analytics is to change a single person’s mind about clinical or operational behaviour, and to scale so you can meet that goal for a whole cohort of people.  The uptake of Hadoop technologies and thinking brings us a step closer to personalized analytics based on huge volumes as we climb down from our one-size-fits-all warehouses.  (writer makes huge trumpeting noise and flaps ears)


Filed under Uncategorized

Is Power View the best analysis tool with the word “View” in its name?

Readers of the blog will know that historically there has been a debate over the value of reporting solutions versus an integrated BI solution stack. Sometimes reporting wins – but with the launch of Power View, Microsoft has the missing jigsaw piece that addresses this issue and enables delivery of an integrated stack with all the benefits of the reporting tools with which Microsoft competes.

Reporting solutions have become very popular, not least in the NHS market, for a couple of reasons: firstly, their ease of use, and secondly, the commercial strategy used by vendors. As ever, this is IMHO! They are easy to use because end-users can quickly and easily connect to a data source, ingest it into the reporting tool “in-memory” and produce highly visual outputs. Marry this to a commercial strategy that sees your reporting tool being the front end of a market-leading, say, finance and patient-level costing tool where the business users hold budgetary sway, and you can see how the solution becomes ubiquitous from a relatively recent standing start.

There are, however, risks in having a BI strategy that centres on deployment of a reporting tool. The tools carry licensing costs over and above the price of licensing a standard “knowledge worker” desktop. Although it is easy to get going with the tools, you hit a ceiling pretty early and often need training and specialist skills, which can cost. The key selling feature of the tools is their ability to produce things that are visually appealing without managing the underlying data – you just connect to it. So you can produce pretty garbage, if you are not careful.

Microsoft’s story used to address this by arguing that an integrated technology stack was king: one that dealt with data warehousing (which reporting tools don’t), business intelligence (which reporting solutions do) and collaboration tools so that you can act on the intelligence you gain (which reporting solutions don’t). Many of my friends and colleagues agreed. There are lots of instances of the Microsoft stack being used very well. However, the market only partially bought into the Office / SharePoint / Reporting Services front end and has become increasingly beguiled by the likes of QlikView and Tableau.

Enter Power View. Power View is really a rebrand of Excel. On a feature / function, like-for-like comparison with reporting tools, I would argue it wins. Let’s see what Gartner says when the new Magic Quadrant comes out. The integrated stack story still stands. It has just been added to.
Using Power Query, it outperforms reporting tools for ease of data ingestion from multiple sources, including some unique HTML-reading capability that blows my socks off. Using Power Pivot, it has immense, efficient, high-performing in-memory capability. Using Power Map, it can geo-code data on the fly and produce the equivalent of GIS insight. Using Q&A, it can process questions asked in English (like “which drugs cost the most”) and mine your data to give a dashboard-style response. Using Power View, there is a range of visuals that can be pulled together using simple Pivot-Table-style wizards, so you can produce cool things without learning new tools. The outputs come in HTML5 format so that you can use them across devices and operating systems.
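To be clear about what the “which drugs cost the most” style of question resolves to under the hood, it is essentially a grouped aggregate over the in-memory model. This pandas sketch is purely illustrative – it is not Microsoft’s Q&A engine, and the prescribing data is invented:

```python
import pandas as pd

# Hypothetical prescribing data of the kind a hospital pharmacy might hold
df = pd.DataFrame({
    "drug": ["infliximab", "infliximab", "aspirin", "rituximab"],
    "cost": [1200.0, 950.0, 1.2, 1400.0],
})

# "Which drugs cost the most?" becomes a grouped sum, sorted descending,
# which a dashboard can then render as a bar chart
answer = df.groupby("drug")["cost"].sum().sort_values(ascending=False)
print(answer.head(3))
```

The clever part of Q&A is the translation from English to that query; the query itself is bread and butter.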

I am tempted to say “beat that”.

Of course, the commercial issue remains. The thing that gets great software embedded is proving the value and carrying the message through a commercial route where all players “win”. Clearly, selling expensive, high-margin reporting tools can be highly rewarding; however, more rewarding for everyone concerned is using the power of the tool to make compelling content that changes clinical and operational behaviour.

I am working with a well-respected colleague (well… by me anyway… yeah, that’s you, Philip!) on a project to trial Power View on Surface Pro devices for clinicians. It’s been very interesting. We have been able to produce appealing views on trusted clinical data, augmented with open data, and put them into the hands of mobile workers at a fraction of the cost (largely because it’s a fraction of the effort) of previous solutions. We will blog the results when they come out later in the month.

Now, you’ll know Ascribe has been through changes. We were bought by EMIS. From an analytics perspective it’s been great. Using the above project, we have been able to pull in community data from a different company within the EMIS group and instantly demonstrate value. Not from throwing cool dots onto a groovy map, but in linking data from different care settings and tracing patient behaviour to identify where service interventions can make a difference. Success is that you don’t notice the software.

And there, my friends, is the difference between buying some software licenses for a reporting tool and working in partnership, across an integrated stack, to produce compelling content efficiently, in a format that can be consumed and is appealing.


Filed under General BI

Small Data is More Important Than Big Data…but it’s the same?

I sat through a bunch of Big Data presentations a few months ago and it was a bit like watching a succession of Dr Evils from the Austin Powers films talking about how their new secret weapon could process four gazillion billion terabytes of data in seconds. Picture them putting their little fingers to their mouths. Then I slunk off to a side room and listened to the most interesting Big Data presentation that I have ever seen. It changed the way I thought about the opportunity created by Dr Evil, and his impotence in the face of business need while he focussed on being the biggest, baddest data processor in the world. Brouhahahaha (that is how I spell villainous laughter).

The presentation was about the US election. They all were. What made this one great is that Obama’s team had worked out how to use the smart stuff to change the behaviour of single voters. They identified voters who were open to changing their mind about who they voted for, what issues they cared most about, what they wanted to hear from their politicians, where they went to get their information and then they pumped targeted messages out. This is a Big Data story insofar as the technology required to achieve this needs to handle the velocity, variety and volume of data that we all talk about.
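In spirit (I have no sight of the campaign’s actual code), the “open to changing their mind” step is a propensity model: train on voters whose response to past contact is known, score everyone else, and spend scarce contact effort at the top of the list. A minimal sketch on synthetic data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical per-voter features: e.g. issue engagement, media habits, past turnout
X = rng.normal(size=(1000, 3))
# Hypothetical label from past canvassing: 1 = shifted preference after contact
y = (X[:, 1] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

model = LogisticRegression().fit(X, y)

# Score the whole electorate, then target the most persuadable 100 first
persuadability = model.predict_proba(X)[:, 1]
targets = np.argsort(persuadability)[::-1][:100]
```

The Big Data part is doing this for every voter in the country from messy, fast-moving sources; the small data part is that the output is one number per person.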

But it’s actually a small data story.

Think about this in a clinical setting. The goal of BI is to identify opportunities to change clinical and operational behaviour – not to make people vote, but to get them to work in the safest, highest-quality and most cost-effective way.
Think about it this way. This is about someone changing your behaviour, and to do that they need to understand you, your role, your context and your challenges, and then they need to talk to you. The problem is that this is hard to scale. The process can make you good, but there are millions like you, and one man or woman can’t talk to you all. It’s too hard. So the boss shies away from addressing the issue and looks instead for the sort of knockout business cases, with knockout RoI, that are going to deliver service improvement.
But actually the benefits only come from one person changing the way they work, and then scaling that – rather than from a national initiative that looks beguiling but isn’t going to address that issue. Obama did indeed carpet-bomb the public with information, but the difference between him and Romney may well have been down to how he spoke to those key few, and the prize for him was huge.

We have just finished a Big Data project with a hospital in England, Microsoft and Intel – although I am now going to start calling it a small data project. It used Ascribe’s Unscheduled Care software, the artist formerly known as Symphony. Love that software.

The benefits we gained were huge and unexpected. We found lost revenue, because the hospital was not charging for the scale and complexity of its work – that scale and complexity being hidden in case notes. We found a high quality of service, as the case notes showed that clinicians were offering drinks, supporting patients in going to the toilet, regularly checking in on them, showing families to the canteen – all kinds of things that you wouldn’t find in the radio buttons and drop-down lists that we use to record activity. We found trends and patterns that were driving A&E attendance – actually driving it – people were saying “this is why I came” rather than just noting the fact that they came – and from this we can work out how to direct services to address the causes of demand. We were able to link patterns in patient behaviour to data sets from social care and look at why some care homes were generating more attendances than others. We looked at the places where incidents were happening in the community and generating attendances at hospital, rather than just where people live, as there is a difference between these things when you think about commissioning services.
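The real project used a proper natural language processing engine, which I won’t reproduce here; but as a crude illustration of the idea – pulling stated causes of attendance out of free text instead of relying only on drop-down codes – here is a toy sketch on invented notes:

```python
import re
from collections import Counter

# Tiny invented stand-in for real A&E case notes
notes = [
    "Patient fell on icy pavement outside care home, came because of wrist pain",
    "Came because GP surgery was closed, minor burn from kettle",
    "Fell at home, neighbour called ambulance",
]

# Crude patterns for explicitly stated causes: "fell ...", "came because ..."
causes = Counter()
for note in notes:
    if re.search(r"\bfell\b", note, re.IGNORECASE):
        causes["fall"] += 1
    if m := re.search(r"came because (?:of )?(\w+(?: \w+)?)", note, re.IGNORECASE):
        causes[m.group(1).lower()] += 1

print(causes.most_common())
```

Even this naive version surfaces demand drivers (“fall”, “GP surgery closed”) that a tick-box attendance record simply never captures.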

The list goes on. When you give small data to clinicians they become even more valuable in the design and delivery of care. They don’t care how much processing I had to do to give them the small data – they just care that I gave them something small, consumable and insightful. They care that the definition of insightful was their definition, based upon their desire to improve the lives of patients at the coalface. Using small data they do that.

(Mr Clicktastic has been on holiday)


Filed under Emergency Care, General BI, Hospitals

Has Microsoft got too many BI products part deux – this time with POWER!

One of the things you don’t really think about when you are blogging is the longevity of the content. Blogging is great for immediate thoughts; however, like all things in cyberspace, it lives forever and loses its synchronisation with the reality at the time it was written. So I am coming back to the question of whether Microsoft has too many BI tools. The answer before was “probably, but does it matter?”; the answer now is that the story is more coherent, although a fresh set of challenges emerges.

Here’s the new mantra: BI is Office, Office is 365, so everything is cloud-first, the world is always online, and the enterprise vision of BI backing into warehouses / in-memory data stores / Hadoop clusters is as strong as ever. Simples!

All the BI tools now start with the word Power (except Reporting Services, which is the nearest our market has to a ubiquitous BI platform), and this branding has helped solidify the vision of an integrated suite of BI tools that gives the user the flexibility to choose the best tool for the job. Which is pretty much how we articulated the story previously, but it’s nice to have that support.

It’s also nice to have a multi-platform and cloud BI story, although this can only be considered nascent at best, because PowerBI with its new HTML5 (as I understand it) coat of paint doesn’t provide me with as rich a “feature / function” story as some of our competitors’ tools. Although, like all things, it will get there pretty soon, I am sure. Contrast that with Microsoft’s consumer story, where applications run beautifully through the continuum of phone to slate to PC to TV, and you know that R2 of PowerBI is going to be consistent with this vision and therefore is going to be fantastic.

The more interesting developments, for me, are taking place under the hood. The work Microsoft are doing to beef up the ability of end-users to build ever more sophisticated algorithms in Excel is amazing, and readers of the blog will know how we are using HDInsight to tackle big (or should that be Big?) unstructured data (Data??).
The challenge is the cloud-first angle. The reality is that Microsoft needs to improve its cloud story, and so do we, as we look to realise the benefits of cloud for our customers. But the reality is also that data sovereignty is still an issue for IG (although the patient-level versus patient-identifiable debate is interesting and is moving the agenda along); hosting is an issue for businesses that have entrenched beliefs and resources committed to on-premise solutions; and moving to the cloud is a job of work that is not in workplans and strategies that were agreed a few years ago and have a few years to run. So moving people’s mindsets is not going to happen overnight. And having cool BI is not sufficient to move them – they need to buy into cloud as an enterprise, which makes the decision more difficult to make and then execute.

So, it’s good that the story is getting more coherent and also widening to encompass devices and services in-line with the Microsoft strategy for all things. It’s good that the cloud story is gaining traction and increasing the options for our customers as they look to be productive and save money. I am optimistic about the future. I look forward to getting more immersed in the all-things-Power product set and aligning it to the needs of the Health and Social Care market whilst being realistic about the challenges ahead.


Filed under General BI

Ascribe at Microsoft’s World Partner Conference

Fact is that whatever Microsoft are talking about today we will be doing tomorrow – even if tomorrow takes more than 24 hours to come around – and so WPC (the Worldwide Partner Conference) is always a great barometer for me. I have just packed for home and am thinking about how it went this year.

It was very, very good.

Some years the messaging is simple because there is a major product launch. This was the case with Windows 8 last year. Sometimes, like this year, there is a theme which is more complex. The strapline is “Accelerating growth together”, and the mood is one of optimism created by this theme. Contrast that with “Do more with less”, which was a previous mantra. The global economy is looking up. Come on, let’s get enthusiastic about economic recovery. Let’s not be miserable. Microsoft are going to help us.

I’m not being cynical. I genuinely believe that talk of depression is a self-fulfilling prophecy and that a lot can be achieved by thinking in a more upbeat way. Just reading that back, you can tell I have been in Texas all week…

So, the way we are going to improve our lot is to think about four things: big data, enterprise social, devices and mobility, and errr – there is always – cue fanfare – CLOUD.

Let’s get it out of the way. I like cloud. I like the revenue model: pay-as-you-go, process data at scale. However, it’s been around for a bit without really lighting up, hasn’t it? I can’t say too much now, but we have been working on a project that really makes Cloud, in particular Azure, relevant for the NHS. I’ll tell you about it soon. Cloud might just deliver on its promise to us in the UK public sector imminently.

Anyway, back to optimism around new paradigms.

I am bound to talk about Big Data, aren’t I?  I was part of a session run by the SQL Server team promoting best practice in big data management.  Readers of the blog will know that we at Ascribe have been working in this space for a year or so and have achieved some interesting results with our partners and friends at Two10degrees (the Kings of Azure).

We were both chuffed to our mint-balls to hear this work called out in Monday’s vision keynotes. It was a real privilege to be mentioned as part of the speech that sets the tone for Microsoft and their partners’ strategy for next year. As we beaver away in our potting sheds in Bolton we don’t often realise that what we are doing is actually quite leading edge.

In fact we have had a few name checks over the last couple of days, which seem to be on the back of us having delivered a good business story – we have created relevant software for the cloud.

By the time you read this, news of the Microsoft reorganisation will be out.

Many partners will be nervous about the company now being a devices and services outfit – particularly if they manufacture devices themselves. We saw a lot of devices this week, and we saw the same experience being used across all of them, powered by user profiles and preferences that live in the Cloud and are delivered down to the device. For consumers this will be very interesting – the same experience on your phone / slate / PC, but tweaked (for example, the size of images) so it works. At work this will be similarly powerful, I think.

If you walk behind a clinician for a day and watch how they consume data, you get a core set of requirements: complete and usable data about their patients, in a variety of settings, and a frustration that the data is dispersed across lots of formats. That is really inefficient.

Final word is on PowerBI, which got a lot of people very excited because it delivers some amazing features and functions. The most eye-catching is the Q&A tool, where a user can type in a question, like “how many patients did I see last month?”, and the search mines the Trust’s data sources and delivers back a chart rather than a page of search hits which contain the data somewhere. This is a great example of Microsoft converging products – in this case Bing search – to the infinite good. I’ll be demoing this as soon as it’s available. Gimme a ring if you want to see it.


Filed under General BI

The ethics of big data – time to start the debate. Update from Friday’s BDA conference

Got up at 5.30 and travelled to London, to watch an IBM presentation on a telly in a hotel lobby. I’m at the BDA Big Data conference and it is standing room only. My mate Chris would love it. A room full of hammers looking for nails – the hammers are nicer than last year’s and are ever diversifying, but they still ain’t found them nails.

The IBM session was fantastic, actually. Readers will know of my allegiance to Microsoft, but I have to say that IBM have found their nails and worked back from there to the technology, which is absolutely the right thing to do. I heard a phrase, “the segment of one”, which I loved. Oh, come on… put your cynicism to one side and think about it. The idea that big data starts small has been around for a bit, but this is the first piece of solution positioning that has thought about influencing a relationship between organisation x and citizen y (singular), using big data. Well, the first time from anybody else.

My goal, and this is directly aligned with Ascribe’s I know, is to improve the health and well-being of individuals in society and therefore improve society. This is the antithesis of conventional BI, which focusses on how many cost how much and busted which target. That is why I know where the nails are – I care about, and therefore have made the effort to understand, my domain.

So the hammers and nails will align eventually, and IBM are clearly leading the way on this, if today is anything to go by. However, there was an elephant in the room (and I don’t mean that bloody Hadoop elephant), which was introduced by David Davies, the Master of Ceremonies, who asked about public views on the use of Big Data. This is the information governance debate in its operational sense, and the ethics debate when abstracted to the philosophical.

What do people think about the ethics of Big Data?

On the one hand, the data is known and volunteered (albeit potentially subliminally), the “use for the greater good” theory is hard to argue with, as is “nothing to fear if you are doing nothing wrong”. I heard a car analogy today. Several billion lines of code in a Ford Focus and [insert your own joke]. So it’s a mobile data centre. It can auto-call 999 if you have an accident and are out cold, it can reduce your insurance premiums by monitoring your driving and rewarding safe practice, and it can tell you how to extend its life and efficiency. Great. But not if you routinely drive your company car at 85mph and don’t want it calling 999 when you have an accident while somewhere other than where you were meant to be.
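To ground the car analogy: the logic involved is trivial, which is rather the point – the ethics, not the code, is the hard part. A hypothetical sketch (the thresholds and the discount formula are invented, not any insurer’s real scheme):

```python
SPEED_LIMIT_MPH = 70
CRASH_DECEL_G = 4.0  # assumed impact threshold; real systems tune this carefully

def premium_discount(speed_samples):
    # Reward drivers who rarely exceed the limit (illustrative formula)
    compliant = sum(s <= SPEED_LIMIT_MPH for s in speed_samples) / len(speed_samples)
    return round(20 * compliant)  # up to 20% off

def should_call_999(decel_g, airbag_fired):
    # Auto-dial emergency services on a hard impact
    return decel_g >= CRASH_DECEL_G or airbag_fired

print(premium_discount([65, 68, 85, 60]))  # 3 of 4 samples compliant -> 15
print(should_call_999(5.2, False))         # True
```

Ten lines of code, and your car is now an actor in your insurance premium and your whereabouts. That asymmetry is exactly what the ethics debate needs to catch up with.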

Should we let this spoil it for the rest of us – he said in a way that would incite an angry mob?

On the other hand, my medical records are my business and I’ll decide who sees them, for what purpose, thanks. First of all, those who currently have access to them should use them – before I put them to secondary use. Secondly, I need to be convinced they will be used ethically – not that I can define ethics or know where to go to have that dialogue. And that’s the problem. Do I trust my champions to resolve the ethics issue on my behalf, as a citizen / patient?

So when I think about Big Data and ethics, I want to start with the first point, and this is somewhere I believe value can be, and slowly is (through my ’umble work) being, realised. Consider the work being done in Yorkshire just now, by us, Microsoft and an NHS partner, where we are processing text through a natural language processing engine and producing abstract patient records which will make it easier for clinicians to quickly consume an overview of the patient and drill directly into the most relevant areas of case history. The IG team on site are working very closely with us to help deliver this because it is relevant, well governed, doesn’t compromise the patient and has an easy-to-articulate benefit. As a patient, I want you to know everything about me so you give me the best care. I don’t want you ignoring my case history because it is inaccessible in the time and environment our relationship operates within.
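The Yorkshire work uses a full NLP engine, which I obviously can’t reproduce in a blog post; but the idea of an abstract patient record can be illustrated with a toy extractive summariser that keeps the sentences densest in clinical terms (the term list and the history are invented):

```python
import re
from collections import Counter

CLINICAL_TERMS = {"diabetes", "insulin", "chest", "pain", "allergy", "penicillin"}

def abstract_record(case_history, max_sentences=2):
    # Extractive summary: keep the sentences densest in clinical terms,
    # in their original order
    sentences = re.split(r"(?<=[.!?])\s+", case_history.strip())
    scored = [(sum(w.lower().strip(".,") in CLINICAL_TERMS for w in s.split()), i, s)
              for i, s in enumerate(sentences)]
    top = sorted(scored, reverse=True)[:max_sentences]
    return " ".join(s for _, _, s in sorted(top, key=lambda t: t[1]))

history = ("Patient reports chest pain on exertion. Enjoys gardening. "
           "Type 2 diabetes, managed with insulin. Known allergy to penicillin.")
print(abstract_record(history))
```

A real engine does vastly more (negation, coded vocabularies, temporality), but the shape is the same: free text in, a short, clinically dense abstract out, with the full record one drill-down away.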

There are books on the ethics of Big Data – some work needs to be done to ground the theory into reality or the hammers and nails will never do their job. We have a start but there is a long way to go before the thinking matches the potential of the technology.


Filed under General BI