Interesting read. I would be concerned with data protection and privacy, especially how this data is transmitted and stored in the context of the caregiver and their organization. I would also be interested to see this as a progression from something like smartwatch functionality and health stats, gradually moving into this space.
Ideally there would be a way to incorporate this tech and the ML into the current generation of wearables for younger users, which would hopefully slow the eventual need for this type of palliative care.
To JP’s point above, I think there is a different dilemma that stems from those who can afford (technologically and monetarily) to farm out their killing to robots. If one side possesses AI driven autonomous robots and the other doesn’t, who bears the ethical and legal risk of civilian deaths and collateral damage? Who ultimately makes the decision whether or not to fire into the crowd of people, or the religious site that the enemy is using as cover and concealment? I agree that regulation makes the most sense, but how do we enforce it?
This is one of the more interesting applications of machine learning that I have read about. Their low default rates seem to indicate they are doing well, but is it possible to alter your phone usage in a way that ultimately makes you seem more creditworthy? At the individual level, this does not seem like a threat, but at scale it could cause significant problems and would ultimately test how quickly the machines can learn.
To your last point about effecting real change in low-income areas, does this need to show some sort of long-term change? Or does it just need to disrupt payday lenders and reduce the negative effects of predatory lending in underserved areas?
It seems like Sony is casting a much wider net than before, now aiming to capture the ideas of the startup market. How far are they willing to stray from their core competencies in order to pursue the next “big thing?” Fundamentally, though, this means that Sony needs to pick winners from the beginning and fund them, since they can’t fund everyone. That does not seem to be their strong suit – how do they overcome their own weakness at picking key trends? Maybe it ties into their much-needed diversity, encouraging more diverse work experience and talent alongside the factors you already mentioned.
The author raised an interesting point about equity and fairness when it comes to making money from cancer treatment. While pathologists are certainly responsible for major breakthroughs in cancer care, they are compensated for their treatment of patients. MSK’s partnership with Paige.AI is the result of for-profit healthcare, the same system that compensates doctors at different rates at different hospitals for different jobs; this is just an extension of that. I think that anything less than this kind of partnership threatens to undermine the important innovations currently being made by aggregating large data sets and applying machine learning.
Interesting read. At what point does this scouting extend to the minor leagues, college, and high school? At what point do you need people to go to games and scout talent to feed into the machine, vs. deploying technology as a sensor to crowdsource this?
I’ll be interested to see how their models and metrics shift if baseball continues to impose time limits between pitches or shortens the game in an effort to keep more fans engaged.