Entries in Data & Technology (78)

Friday, July 19, 2019

Overconfident Healthcare Organizations? Could Be According to Healthcare Cybersecurity Survey 

By Clive Riddle, July 19, 2019

LexisNexis Risk Solutions, in collaboration with Information Security Media Group, has released results from its recent survey of hospitals, medical groups, and payers in a new 18-page report, The State of Patient Identity Management. The survey found that 50% of respondents are confident they have the necessary controls in place to prevent unauthorized access to patient information and 58% believe their portal cybersecurity is above average (only 6% feel they are below average), yet 35% don’t deploy multifactor authentication.

To digress, some insight into those results can be gained from reading last week’s mcolblog post by Kim Bellard on Our Dunning-Kruger Healthcare System, which discusses the Dunning-Kruger effect, “the cognitive bias that leads people to overestimate their knowledge or expertise,” illustrated in the world of NPR’s Lake Wobegon, where “all the children are above average.”

88% of the organizations surveyed have patient/member portals, and 93% use username and password as the patient portal authentication method. 65% deploy multifactor authentication, with 39% using knowledge-based Q&A for verification, 38% using email verification, and 13% deploying device identification. 65% report that their budgets for patient identity management will not increase in 2019.

Here are the top three cybersecurity takeaways of the report, according to LexisNexis:

  1. Traditional authentication methods are insufficient: As a result of many healthcare data breaches, hackers have access to legitimate credentials; users are also easily phished. Therefore, traditional username and password verification is considered an entry point, not a barrier, and alone cannot be relied upon to provide a confident level of security.
  2. Multifactor authentication should be considered a baseline best practice: HCOs should rely on a variety of controls, ranging from knowledge-based questions and verified one-time passwords to device analytics and biometrics, to authenticate users based on the riskiness of the transaction. The riskier the access request, the more stringent the authentication technique should be.
  3. The balance between optimizing the user experience and protecting the data must be achieved in an effective cybersecurity strategy: HCOs need to make it easy for patients and partners to access records while ensuring adequate data protection. To do this, an HCO's cybersecurity strategy should layer low- to no-friction identity checks up front, making it easier for the right users to get through, and layer more friction-producing identity checks on the back end that only users flagged as suspicious would complete (a minimal illustrative sketch of this layered, risk-based approach appears below).
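
To make takeaways 2 and 3 concrete, here is a minimal, hypothetical Python sketch of a risk-based ("step-up") authentication flow: every request starts with username and password, a verified one-time passcode is added once any risk signal appears, and a device or biometric check is reserved for the riskiest requests. The signal names, thresholds, and functions below are illustrative assumptions, not anything prescribed in the LexisNexis report.

```python
# A minimal, hypothetical sketch of risk-based ("step-up") authentication for a
# patient portal. None of these names, signals, or thresholds come from the
# LexisNexis report; a production portal would use vetted identity and MFA services.

from dataclasses import dataclass


@dataclass
class AccessRequest:
    user_id: str
    new_device: bool        # device not previously seen for this user
    unusual_location: bool  # login location differs from the user's usual pattern
    sensitive_action: bool  # e.g., exporting full records vs. viewing one visit summary


def risk_score(req: AccessRequest) -> int:
    """Count risk signals; real systems weight many more (velocity, IP reputation, etc.)."""
    return sum([req.new_device, req.unusual_location, req.sensitive_action])


def choose_challenges(req: AccessRequest) -> list:
    """Low-friction checks up front; stronger factors only as risk accumulates."""
    challenges = ["password"]                   # username/password: an entry point, not a barrier
    score = risk_score(req)
    if score >= 1:
        challenges.append("one_time_passcode")  # verified OTP via email, SMS, or authenticator app
    if score >= 2:
        challenges.append("device_or_biometric_check")
    return challenges


# Example: a new device, an unusual location, and a records export trigger all three layers
request = AccessRequest("patient-123", new_device=True, unusual_location=True, sensitive_action=True)
print(choose_challenges(request))  # ['password', 'one_time_passcode', 'device_or_biometric_check']
```

The point of the layering is that routine, low-risk logins see almost no added friction, while the stringency of the challenge rises with the riskiness of the request, which is the balance the report's third takeaway describes.
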
Tuesday, June 25, 2019

Growing where you are already planted

Kristin Rodriguez, Health Plan Alliance, June 24, 2019

 

Dan Michelson, CEO of Strata Decision, believes integrated delivery systems need to shift their focus from buying and building hospitals to leveraging their existing platforms to generate growth and, often, more profitable streams of revenue.  This shift positions delivery systems to become hubs for health and healthcare in the future.

What needs to occur for this transformation to take place? 

In an article published in Becker's Hospital Review recapping the 2019 JP Morgan Healthcare Conference, Michelson outlines six ideas for integrated delivery systems to get started on leveraging their existing platform.  Each represents a formidable challenge, but if we work together and take advantage of resources available to us, we can start moving in the right direction.

  1. Embrace the digital front door: Healthcare providers have long excelled at building relationships and trust once consumers walk into our hospitals and clinics.  We need to harness the ability to create that same meaningful relationship, without limiting ourselves to a physical location.  

    Benjamin Isgur, Health Research Institute Leader at PwC, echoes this advice.  At the recent Alliance Spring Leadership Forum, he underscored that private equity investors are particularly interested in anything that gets closer to the consumer.  And consumers themselves are eager for a new era of care delivery, with new venues and new menu options.  If we don't offer consumers a more convenient alternative, someone else will.
  2. Get serious about affordability: This isn’t just about transparency or about reallocating resources more thoughtfully.  It is bigger than combining clinical and financial data.  The healthcare cost problem is huge and policymakers, drug manufacturers, insurers, delivery providers and consumers all play a role.

    It's important we avoid the danger of “everyone’s problem” becoming “no one’s problem.”  As part of integrated systems, Alliance members are particularly well positioned to get a strong line of sight on this challenge.
  3. Don't just provide, prevent: Michelson points to the “strong strategic rationale associated with taking on a broader role of driving health versus only providing healthcare” in the communities we serve.  Policymakers understand this too; Medicaid and Medicare Advantage plans are increasingly encouraged toward payment models and benefit design approaches that take on more than just clinical care.

    Just a few themes in the government-sponsored care space include VBID, telehealth, benefit flexibility, and behavioral health integration, all of which present unique opportunities to leverage the network, venues of care, community partnerships, and more to reimagine the system’s role in the local healthcare landscape.
  4. Partner to innovate, or miss out: Becoming a hub isn’t just about the digital front door or food farmacies. It also means creating a space for innovators to gather, where research and education can occur so that ongoing evolution becomes a core competency of the system.
  5. Target chronic conditions and specific services: This builds on the center of excellence model in profound ways.  Systems that craft a powerful experience for specific chronic conditions or targeted services stand a better chance of maintaining a relationship with those consumers.  Michelson notes that this is both an opportunity and a threat for integrated systems, as we increasingly compete with new platforms gaining competency in serving chronic conditions, like those of CVS Health and Walgreens.
  6. Don’t just aggregate data – use it:  An Alliance member and informatics leader said that he envisions the day when his informatics teams can stop being data archeologists and can instead be data analysts.  The truth is that integrated systems are still solidifying their competency as data aggregators.  But it’s not enough.  It’s time to turn our attention to applied analytics: practical data sets that provide decision support so that we can gain better insights and pivot our platforms even faster.

With payers big and small across the country, the Alliance member network is a veritable think tank for executives wrestling with these questions and challenges.  Join us and work elbow to elbow with your peers at our upcoming events designed exclusively for Alliance organization leaders. You can also meet and hear from Dan Michelson at the Fall Leadership Forum 2019.

Thursday, May 23, 2019

The Health Tech Our Toddlers Should Never Know

by Kim Bellard, May 23, 2019

Joanna Stern wrote a fun article for The Wall Street Journal: "The Tech My Toddler Will Never Know: Six Gadgets Headed for the Graveyard."  My immediate thought was about health tech's equivalent list.  There certainly is a lot of health tech that should be headed to the graveyard, but, knowing healthcare's propensity to hang on to its technology way too long, I had to modify her more optimistic headline to say "should" instead of "will."

One can always hope.  Here's my healthcare tech list:

1.  Faxes:  You knew it had to be at the top of the list.  Anyone under thirty who knows how to work a fax machine probably works in healthcare.  Faxes persist because they supposedly offer some security advantages, but one suspects inertia plays at least as big a role.  There are other options that can be equally "secure," while making the information digital.

2.  Phone Trees:  We've all had to call healthcare organizations -- doctors' offices, testing facilities, health plans, etc.  Most times, you first have to navigate a series of prompts to help specify why you are calling, presumably to get you closer to the right person.  There are probably studies that show phone trees save money for the companies that use them, and perhaps some that even claim they save customers time, but this is not a technology most people like.  By 2030 I want my AI -- Alexa, Siri, etc. -- to deal directly with the companies' AI to spare me from phone trees.

3.  Multiple health records: I have at least five distinct health records that I know of, only two of which communicate with each other at all.  For people with more doctors and/or more complex health issues, I'm sure the situation is even worse.  EHRs are old technology, the cable of healthcare.  By 2030, we should each have a single health record that reflects the broad range of our health.

4.  Stethoscopes:  You've seen them. Your doctor probably has one.  Find the oldest photographs of doctors that exist and you might find them with stethoscopes; they are that old.  It's not that they are useless; it's that there are better alternatives, such as handheld ultrasounds or even smartphone apps.  For Pete's sake, people are working on real-life tricorders.  By 2030, seriously, can we be using their 21st century alternatives?

5.  Endoscopes: Perhaps you've had a colonoscopy or other endoscopic procedure; not much fun, right?  We do a lot of them, they cost a lot of money (at least, in the U.S.), and they involve some impressive technology, but they're outdated. By 2030, we should be using things like ingestible pill cameras, with ingestible robots to take any needed samples or even conduct any microsurgery. 

6.  Chemotherapy: Chemotherapy is literally a lifesaver for many cancer patients, and a life-extender for many others.  We're constantly getting new breakthroughs in it, allowing more remissions or more months of life.  But it can pose a terrible burden -- physically, emotionally, and financially -- on the people getting it.  Chemotherapy has been likened to carpet bombing, with significant collateral damage.  Increasingly, there are alternatives that are more like "smart bombing" -- precision strikes that target only cancer cells, either killing or inhibiting them.  By 2030, perhaps cancer patients won't fear the treatments almost as much as the cancer.

Healthcare certainly has no shortage of technology that we should hope today's toddlers will never have to use or experience.  The above are just six suggestions, and you may have your own examples.  We can make these happen, by 2030; the question is, will we?

This post is an abridged version of the posting in Kim Bellard’s blogsite. Click here to read the full posting.

Thursday, April 25, 2019

Robots Need DNA Too

by Kim Bellard, April 22, 2019 

DNA, it seems, never ceases to amaze. Now scientists are using it to create new kinds of "lifelike" mechanisms.   Pandora, we may have found your box. 

Researchers from Cornell recently reported on their advances.  They used something called DASH -- DNA-based Assembly and Synthesis of Hierarchical materials -- to create "a DNA material with capabilities of metabolism, in addition to self-assembly and organization – three key traits of life."

That sends chills up my spine, and not necessarily in a good way. 

Lead author Shogo Hamada elaborated:

The designs are still primitive, but they showed a new route to create dynamic machines from biomolecules. We are at a first step of building lifelike robots by artificial metabolism.  Even from a simple design, we were able to create sophisticated behaviors like racing. Artificial metabolism could open a new frontier in robotics.

The reference to racing in his quote refers to the fact that their mechanisms were capable of motion -- likened to how slime mold moves -- and the researchers literally had their "lifelike materials" racing each other.  If I'm reading the research paper correctly, the mechanisms were even capable of hindering their competitors.

Well, that's lifelike, all right.

It wasn't all days at the race track; oh-by-the-way, they also demonstrated its potential for pathogen detection, which sounds like it could prove pretty useful.

These mechanisms eat, grow, move, replicate, evolve, and die.  Dr. Luo says: "More excitingly, the use of DNA gives the whole system a self-evolutionary possibility.  That is huge."  Dr. Hamada adds: "Ultimately, the system may lead to lifelike self-reproducing machines."

Those chills are back.

There has been a lot of attention on engineering advances that will allow for nanobots, including uses within our bodies and so-called "soft robots," but we should be giving equal attention to what is called synthetic biology.

Synthetic biology isn't necessarily or even predominantly about creating new kinds of biology, as the researchers at Cornell are doing, but about reprogramming existing forms of life. They're being programmed to eat CO2 (thus helping with global warming), help with recycling, get rid of toxic wastes, even make medicines.

A Columbia researcher believes that new techniques for programming bacteria, for example, "will help us personalize medical treatments by creating a patient’s cancer in a dish, and rapidly identify the best therapy for the specific individual."

In the not-too-distant future, we're going to be programming lifeforms and "lifelike materials" to do our bidding at the molecular or cellular level.  We've been debating and worrying about when A.I. might become truly intelligent, even self-aware, but the Cornell research is giving us something equally profound to debate: how to draw the line between "life" and "things"?


Medicine, healthcare, and health are going to have to develop into more 21st century versions of themselves.  What we've been doing will look like brute force, human-centric approaches.  Synthetic biology and molecular engineering open up new and exciting possibilities, and some of those possibilities will upend the status quo in healthcare in ways we can barely even imagine now.


It's not going to be enough to think of new approaches.  We're going to have to find new ways to even think about those new approaches.  

  
In the meantime, let's go watch some DASH dashes!

 

This post is an abridged version of the posting in Kim Bellard’s blogsite. Click here to read the full posting. 

Wednesday, January 23, 2019

Do Unto Robots As You…

by Kim Bellard, January 23, 2019

It was very clever of The New York Times to feature two diametrically different perspectives on robots on the same day: Do You Take This Robot and Why Do We Hurt Robots? They help illustrate that, as robots become more like humans in their abilities and even appearance, we’re capable of treating them just as well, and as badly, as we do each other. 

We’re going to have robots in our healthcare system (Global Market Insights forecasts assistive healthcare robots could be a $1.2b market by 2024), in our workplaces, and in our homes. How to treat them is something we’re going to have to figure out. 

Written by Alex Williams, Do You Take This Robot… focuses on people actually falling in love with (or at least preferring to be involved with) robots. The term for it is “digisexual.”

As Professor Neil McArthur, who studies such things, explained to Discover last year: "We use the term ‘digisexuals’ to describe people who, mostly as a result of these more intense and immersive new technologies, come to prefer sexual experiences that use them, who don’t necessarily feel the need to involve a human partner, and who define their sexual identity in terms of their use of these technologies."

And it’s not just about sex. There are a number of companion robots available or in the pipeline, such as: 

  • Ubtech’s Walker. The company describes it as: “Walker is your agile smart companion — an intelligent, bipedal humanoid robot that aims to one day be an indispensable part of your family.”
  • Washington State University’s more prosaically named Robot Activity Support System (RAS), aimed at helping people age in place.
  • Toyota’s T-HR3, part of Toyota’s drive to put a robot in every home, which sounds like Bill Gates’ 1980s vision for PCs.
  • Intuition Robot’s “social robot” ElliQ. 
  • A number of cute robot pets, such as Zoetic’s Kiki or Sony’s Aibo.

All that sounds very helpful, so why, as Jonah Engel Bromwich describes in Why Do We Hurt Robots?, do we have situations like: A hitchhiking robot was beheaded in Philadelphia. A security robot was punched to the ground in Silicon Valley. Another security bot, in San Francisco, was covered in a tarp and smeared with barbecue sauce…In a mall in Osaka, Japan, three boys beat a humanoid robot with all their strength. In Moscow, a man attacked a teaching robot named Alantim with a baseball bat, kicking it to the ground, while the robot pleaded for help.

 

Cognitive psychologist Agnieszka Wykowska told Mr. Bromwich that we hurt robots in much the same way we hurt each other. She noted: “So you probably very easily engage in this psychological mechanism of social ostracism because it’s an out-group member. That’s something to discuss: the dehumanization of robots even though they’re not humans.”

 

Robots have already gotten married, been granted citizenship, and may be granted civil rights sooner than we expect. If corporations can be “people,” we had better expect that robots will be as well.

 

We seem to think of robots as necessarily obeying Asimov’s Three Laws of Robotics, designed to ensure that robots could cause no harm to humans, but we often forget that even in the Asimov universe in which the laws applied, humans weren’t always “safe” from robots. More importantly, that was a fictional universe.

 

In our universe, though, self-driving cars can kill people, factory robots can spray people with bear repellent, and robots can learn to defend themselves. So if we think we can treat robots however we like, we may find ourselves on the other end of that same treatment.

 

Increasingly, our health is going to depend on how well robots (and other AI) treat us. It would be nice (not to mention in our best interests) if we could treat them at least as considerately in return.

 

This post is an abridged version of the posting in Kim Bellard’s blogsite. Click here to read the full posting.