{"api_version": 1, "episode_id": "ep_uncanny_valley_wired_c41fee4bca9e", "title": "Silicon Valley Is Spending Millions to Stop One of Its Own", "podcast": "Uncanny Valley | WIRED", "podcast_slug": "uncanny_valley_wired", "category": "news", "publish_date": "2026-04-14T09:00:00+00:00", "audio_url": "https://pdrl.fm/e44d6f/www.podtrac.com/pts/redirect.mp3/pdst.fm/e/tracking.swap.fm/track/uJwtcKQUPuqBQPfusm59/dovetail.prxu.org/5901/84b36859-ee60-45e2-90a9-7fc364475a6b/WI_BigInterview_AlexBores_Mix1_260413__1_.mp3", "source_link": "https://play.prx.org/listen?ge=prx_5901_84b36859-ee60-45e2-90a9-7fc364475a6b&uf=https%3A%2F%2Fpublicfeeds.net%2Ff%2F5901%2Fgadget-lab", "cover_image_url": "https://f.prxu.org/5901/84b36859-ee60-45e2-90a9-7fc364475a6b/images/854f5ec1-220b-4e3e-ac65-2ffea6199d8b/Uncanny-Valley_The-Big-Interview_Podcast__2_.jpg", "summary": "Alex Bores, a former Palantir employee and current New York assemblyman, explains how data ontologies helped recover $20 billion from banks after the Great Recession by tracking flawed loans across securities. He details his resignation from Palantir over ethical objections to ICE contracts lacking guardrails during the Trump administration. Now running for Congress, he advocates for AI regulation and criticizes tools like Slack for disrupting focus, emphasizing policy over tech as the root solution to societal harms.", "key_takeaways": ["Data ontologies can expose systemic financial misconduct by structuring and tracking data objects like individual loans across complex systems.", "Ethical tech deployment requires contractual guardrails, especially in government contracts, to prevent misuse in areas like immigration enforcement.", "Effective policymaking can stem from technical expertise, but upstream legislative action is more impactful than downstream technological fixes."], "best_for": ["tech professionals considering public service", "policy makers interested in AI regulation", "listeners concerned about ethics in government tech contracts"], "why_listen": "The episode offers a rare firsthand account of ethical conflict within a major defense-tech contractor and how technical insight translates into legislative action on AI and data governance.", "verdict": "worth_your_time", "guests": [], "entities": {}, "quotes": [], "chapters": [], "overall_score": 76.0, "score_breakdown": {"clarity": 80.0, "originality": 78.0, "actionability": 65.0, "technical_depth": 72.0, "recency_relevance": 88.0, "information_density": 75.0}, "score_evidence": {"clarity": "Palantir helps organizations make use of data that they already have access to by making it easier to track changes to that data over time.", "originality": "You're always talking about how you are downstream of bad policy, trying to fix it with tech. Here's your chance to go upstream and design it right the first time.", "actionability": "We built a system that let you track individual loans, search for loans moving from tape to tape, and found numerous examples of that exact pattern.", "technical_depth": "An ontology is helping you do. And so we built a system that let you track individual loans, search for loans moving from tape to tape.", "recency_relevance": "The ads are funded by the pro-AI PAC Leading the Future, and they take direct aim at Bores' previous experience as a Palantir employee.", "information_density": "We found numerous examples of that exact pattern. Banks realizing there was a flaw, pulling it out of a security, and then sneaking it into another one later."}, "score_reasoning": {}, "scoring_confidence": 0.95, "transcript_available": true, "transcript_chars": 44996, "transcript_provider": "deepgram"}