
From NASA’s X-59 to Humanoid Workers: The Future Is Getting Weird

Episode Transcript

Speaker 0

Welcome to the Geek News Central podcast, episode eighteen fifty two.

This is your host, Ray Cochran, speaking tonight.

It has been a little minute.

So sorry for the delay on the shows, but welcome.

If it's your first time here, welcome to the show.

And for all you returnees, welcome back.

Got a great show lined up for you folks this evening.

This afternoon for me.

I don't know why I said evening.

But let's kick it off with a really awesome, sweet first article.

NASA's ultra quiet supersonic flying swordfish makes history with the first test flight.

NASA and Lockheed Martin have spent years developing the X-59, a long-nose experimental aircraft designed to quiet the traditional sonic boom into a softer thump.

This breakthrough is key to potentially lifting the decades old ban on supersonic flight over land.

The aircraft just completed its first real test flight out of Palmdale, California on October twenty eighth.

So this is a pretty cool story on some aviation innovation that we haven't really covered too much on the show, I feel like, in recent times.

So really, really awesome step forward in the aviation field.

NASA has officially flown its X-59 flying swordfish for the first time, marking a major step towards the return of commercial supersonic air travel.

The aircraft didn't go supersonic yet, but upcoming tests will push toward supersonic speeds as NASA prepares to gather noise impact data from communities across the country.

This is kinda cool.

I mean, I would be interested to just kinda see what might change for commercial aircraft, or commercial air travel times, with technology like this.

The aircraft flew for about an hour at twelve thousand feet, and they didn't test any, sound barrier breaking, on this test flight.

It seems it was just, mostly for making sure all the devices were working, configurations were correct, fine tuning.

I do believe they're planning to do some testing.

It says here in Oklahoma, but really awesome specs.

According to Lockheed Martin specifications, the X-59 has a top speed of Mach 1.4, or 925 miles per hour (1,489 kilometers per hour), which is almost twice as fast as a Boeing 747.

It's designed to fly at an altitude of 55,000 feet, and the aircraft has a wingspan of 30 feet.

It's 14 feet high and a whopping 100 feet long, giving it a strong resemblance to a swordfish.
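
Just as a quick sanity check on those Mach numbers, and this is my own back-of-the-envelope math, not something from the article: the speed of sound at the X-59's cruise altitude is roughly 660 miles per hour, noticeably lower than the 761 miles per hour at sea level, and Mach 1.4 times that lands almost exactly on the quoted figures.

```typescript
// Back-of-the-envelope check of the article's speed figures.
// Assumption: above roughly 36,000 ft the standard-atmosphere speed of
// sound is about 660.3 mph (it's ~761 mph at sea level, where it's warmer).
const speedOfSoundMph = 660.3;
const mach = 1.4;

const mph = mach * speedOfSoundMph; // ≈ 924.4 mph
const kmh = mph * 1.609344;         // ≈ 1487.7 km/h

console.log(Math.round(mph), Math.round(kmh)); // 924 1488 -- close to the quoted 925 mph / 1,489 km/h
```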

Let me show a front view picture of it here on the article that I have.

It's just cool.

I do believe that there's probably some other models like this out there.

Someone in the comments of this article says Boom Supersonic has done something similar.

Something to look into if you're interested in supersonic flight.

Let's step away from the article for a second.

Welcome back, everyone.

Sorry for the late delivery on this week's episodes, plural.

Chris and I have been pretty backed up.

Chris just picked up a new job, and I've been plowing through midterm season.

We're actually coming up on the end of my term here, in a few weeks, and, man, I don't feel like I'm ready at all.

I think, I got one of my midterms back.

It's probably one of the lowest scores I've ever received on a midterm in the last few years, so I'm definitely really disappointed in myself in terms of performance there.

But hoping to turn that around and potentially, do better on the final.

So I've been studying a little bit extra, working a little bit more on that, trying to compensate for how poorly I did the first time around.

It's very clear I didn't understand the content.

But, I wanna take a second to thank our amazing sponsor, GoDaddy, for their incredible support of this podcast.

They make it possible for us to keep doing what we love and for you to enjoy the show every week.


If you're ready to start your own website or even launch a podcast, GoDaddy has you covered at geek news central dot com forward slash godaddy.

You'll find special promo codes that make it easy and affordable to get started.

Economy hosting, just six ninety nine a month.

It's a full year of hosting with a free domain name, professional email, and an SSL certificate.

WordPress hosting, only twelve ninety nine a month, including that same free domain name, email, and SSL certificate.

And if you just need a domain name, you can grab one today for eleven ninety nine.

And if you're not super tech savvy, GoDaddy's website builder gives you a free thirty day trial on Personal, Business, and Business Plus plans, so you can get online in no time.

Let it build the website for you.

We use GoDaddy products and services here at Geek News Central, and I can definitely tell you their reliability is the real deal.

I don't know if I've ever experienced a website outage personally through GoDaddy.

But your support of GoDaddy directly supports this show, so please use our codes, click and save, and help keep this independent podcast going strong.

Share those links with friends and family.

Every time one's used, it's like writing a check to the show.

Kinda weird my insider portion got cut out here.

But if you guys do wanna support the show directly yourself, we do also have an insider program at geek news central dot com slash insider.

Support the show directly today through the insider program.

We really appreciate those of you that do support the show through this program.

Not everyone needs a website right away, so sometimes you can just contribute what little you can to the show.

We operate on a value for value model.

So if you're not using, modern podcast apps to, stream sats or boost the show, we'd definitely recommend trying that out.

Check it out at podcast apps dot com.

I've been recently using Castamatic, trying that out to see how their app is fleshed out and stuff.

So, definitely something to check out.

We truly thank you guys for supporting us, and a huge, huge thanks to GoDaddy for believing in independent creators like us here at Geek News Central.

Let's get back into the show.

I got another cool article for you guys tonight.

The Phantom transparent four k monitor.

Virtual Instruments has built a prototype twenty four inch transparent four k monitor called the Phantom.

Using a teleprompter style mirror system rather than a transparent OLED, it achieves a stunning five thousand nits of brightness and allows users to see their environment through the screen.

Only ten units will be produced initially.

Ten units.

Give me a break.

So this new prototype display called the Phantom is aiming to reinvent the look of a desktop monitor by making it transparent.

Despite being a concept level device with a limited production, it demonstrates where high end display technology may be headed with mixed reality style visuals.

So apparently, only ten are being made, which is pretty disappointing, honestly.

What the heck?

I wish there were more. I guess it's just a startup, but some of the images on this article make it look really cool.

Obviously, the backgrounds look fake, but definitely looks a little bit niche.

But I think it's cool to just kinda see this starting to come into play, especially with things like Google Glass style devices coming out this year.

Seems like transparency and opacity are very in.

Even in my own window manager, I've set the opacity on my inactive windows so they're a little more see-through, so that, you know, you kinda see the background through them, and it looks like a glass screen.

It's cool.

I don't know.

I get it.

It's cool.

But I think this is awesome.

If you guys wanna check this out, this is a cool article from Tom's Hardware.

Thank you, Hassam Nasir.

And I didn't give credit for the last article.

The previous article is from Live Science, by Damien Pine.

Thank you very much, Damien Pine.

But, yeah, I mean, I guess even though we won't see transparent monitors in stores yet, this kind of tech is signaling the future of workspace screens that blend into your environment.

Yeah.

Right?

And then AR displays. I've recently become more and more of a believer in AR, and I think my next article will actually kinda convince you of that.

Now this next one is in the realm of AI.

This is World Labs' Marble three d world model platform getting released.

If you guys don't know.

So Fei Fei Li's startup, World Labs, has released Marble, a next generation AI model that can generate full three d worlds from text, images, video, or rough layout sketches.

It's part of a major industry shift from flat image generation to spatial intelligence in three d environments.

And I was talking about this last night with my partner, and it it's really interesting to kinda see how this technology is shifting.

There was an AI generated Minecraft style game where it would generate the scene based on, like, your current view and other scenes it had seen in the game.

So you could walk into a dark corner and back out of the corner and actually be in a cave now, because the context had changed to a degree where it was closer to being in a cave rather than where you originally were.

So instead of having it like that, where they're generating it live, now it's being generated with more of a, how do you put it, spatial context.

You get more spatial context, spatial awareness.

If there's a building across the way that has, mushrooms all around it, then the new building you're gonna make is gonna be, mushroom themed as well.

If you guys don't know who Fei Fei Li is, Fei Fei Li is definitely one of the most influential figures in artificial intelligence today and has done a lot of work in visual computing.

She was born in Beijing in nineteen seventy six and educated here in the US.

She got her BA at Princeton and her PhD in electrical engineering at Caltech.

And at Stanford, she's been a professor of computer science, founding co director of the Human Centered AI Institute, and former director of Stanford's AI Lab.

Now I think her most notable achievement is ImageNet.

Anyone that does visual computing work will know what ImageNet is.

It's a large scale image database that unlocked, basically, the modern era of deep learning and computer vision.

So more recently, she's been working on World Labs to pioneer spatial intelligence, AI that can perceive, reason, and act within three-dimensional physical environments.

She's also a huge advocate for human centered and ethical AI and greater diversity in the field of AI in general.

But, yeah, Fei Fei Li is huge right now.

I would definitely look into her and follow her on all the relevant social media if you're really into, AI news and and visual computing.

But let's get back to the article.

World Labs is launching Marble, a platform that lets creators generate three d scenes and environments with simple prompts.

It can output meshes, videos, or Gaussian splats, and it's designed to plug into game engines, robotics simulations, and VFX pipelines.

It's a sign that generative AI is moving firmly into three d.
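
If Gaussian splats are a new term for you: it's a way of representing a three d scene as millions of tiny, soft, colored blobs instead of a triangle mesh. I don't know Marble's actual export schema, but here's a rough sketch of what one splat typically carries in the standard three d Gaussian splatting representation, just to make the idea concrete.

```typescript
// Rough sketch of a single 3D Gaussian "splat" record, based on the
// common 3D Gaussian Splatting representation -- NOT Marble's actual schema.
interface GaussianSplat {
  position: [number, number, number];         // blob center in world space
  scale: [number, number, number];            // ellipsoid radii along its axes
  rotation: [number, number, number, number]; // orientation as a quaternion
  color: [number, number, number];            // base RGB (full models store
                                              // spherical-harmonic coefficients
                                              // for view-dependent color)
  opacity: number;                            // 0..1, how solid the blob looks
}

// A scene is just a huge array of these; a renderer projects and blends
// every blob on screen.
type SplatScene = GaussianSplat[];
```

The takeaway is that a splat scene is just a big pile of data, which is part of why it can plug into game engines and VFX pipelines like the article says.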

So this is really, really cool.

They've changed the approach, and you can see in some of the graphics they provide just exactly how it works.

There was a quote that I wish I had saved because it really, like, filled in the gaps for me in terms of how this works.

Let me see if I can find it real quick.

Now there is a graphic on this page that I do recommend checking out that kinda shows how it works.

And I think it makes a little more sense in terms of, like, why it's more persistent, rather than being live generated and changing the context, you know, like I was saying with the Minecraft thing.

But I definitely recommend checking this out.

This one's from TechCrunch, written by Rebecca Bellan, and truly exciting news in, I guess, the world of computer vision.

Now, hopefully, you know, in terms of our concerns about ethical image usage and whatnot, a lot of that has been taken into account here, given Fei Fei Li's reputation and her dedication to ethical AI.

But I'm sure to some degree there's, you know, some stuff that slipped through that should not have.

I hope not, but I'm curious to see how that is handled. I'll have to look into that to see how that's guaranteed.

You know?

You'd want to make it so that people wanna use the model, especially because a big concern with generative stuff right now is stealing the contents of other people's work to create this new amalgamation.

But on the other hand, this kind of tech would dramatically shorten the time it takes to build, like, virtual worlds.

So if you guys are into gaming, simulations, even possibly, like, AR training routines and digital content creation, it could probably help fill in a lot of gaps, in terms of, you know, even just creating scenes for your podcast or something.

You know, like, you could generate a background scene for your Elgato camera and use that as your, you know, green screen image, which is kinda cool.

That was something I was thinking about trying out, but I haven't been doing too much recording, so it didn't really seem too important at this point in time.

But if I do start doing more recording, hopefully I can set up a nice background instead of just, like, my bedroom.

But, yeah, I I definitely recommend checking this article out.

I think this is gonna be some really groundbreaking stuff.

And now if any of you are into programming or keep up with software development news, this next article is from the GitHub blog: TypeScript's rise in the AI era, insights from lead architect Anders Hejlsberg.

Hopefully I said that right.

Now a bit of context.

GitHub's latest developer data shows that TypeScript has kinda overtaken both JavaScript and Python as the most used programming language on the platform.

Much of this growth is tied to AI assisted coding, which benefits heavily from typed languages that reduce errors at scale.

Now I wonder, if we're talking about AI assisted coding, are we saying that it requires us to strongly type to reduce errors? You know, I don't know.

I guess I would have to look into this more to kinda understand how this works.

But I assume it helps in terms of generative code, enforcing, you know, type rules for specific objects in your system.

But I wonder if it's just that having those types built into the language makes it more implicit for the generative model, rather than using a very broad language like Python, where you have to be specific or be good about, you know, your type context.

I don't know.

I wonder.

So in twenty twenty five, TypeScript became the most used language on GitHub.

More than a million developers began contributing in TypeScript this year alone, which is a sixty six percent jump.

This is due to its mix of strong typing, JavaScript compatibility, and synergy with AI code generation tools.

It's a notable shift in the programming landscape.

And I think, if I remember correctly, TypeScript compiles to JavaScript.

So it's almost like safe setup and clean knockdown.

Right?

Because, you know, most of the Internet's already using JavaScript.

I think TypeScript's a superset of JavaScript.

Right?

But so, I wonder.

I've never, how do I put it? I don't know if I've ever consciously used TypeScript, but if it's a superset of JavaScript, I wonder if it's just more specific JavaScript.

I'll have to look into this a little bit more.

But if you're in tech or aspiring to be in tech, I mean, I'm sure you know, this is the first time in years Python's been overtaken, especially with the AI boom going on right now, and how much support there is in Python.

It's crazy to see this shift because generative AI is able to write cleaner code with strongly typed languages.
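
To make the superset idea concrete, here's a tiny sketch of my own, not something from the GitHub post: the function body below is plain JavaScript, and the interface and annotations are the TypeScript layer on top. That layer is what lets the compiler reject a badly shaped call, which is exactly the kind of mistake an AI code generator might otherwise slip past you.

```typescript
// The TypeScript layer: a type describing the shape we expect.
interface Invoice {
  id: string;
  amountCents: number; // integer cents, to dodge float rounding
}

// The function body itself is plain JavaScript.
function totalCents(invoices: Invoice[]): number {
  return invoices.reduce((sum, inv) => sum + inv.amountCents, 0);
}

// A generated call with the wrong shape fails at compile time instead
// of quietly producing NaN at runtime:
// totalCents([{ id: 1, amount: "19.99" }]);
//   ^ compile error: the object literal doesn't match Invoice
```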

So, yeah, we'll see how this goes in the next few years.

Even as we're seeing more of these code generative tools come up, and more people shifting away from the classic IDE and into things like Cursor style IDEs, which have more of an agentic feel to them.

At least, that's from what I've read.

I actually really need to try the Cursor IDE, because I've heard a lot of good stuff about it, and some of the other, you know, Model Context Protocol based generative AI coding tools now.

I really haven't tried too much of it at all.

And even just seeing the tip of the iceberg with the use of chatbots, like, that's only the beginning.

Agents are where it's at, you know, really dialing down the context in which you're using them.

This next article, also really cool.

Kinda crazy.

Let's go with it.

World's first mass humanoid robot delivery begins as UBTech sends Walker S2 units.

So Chinese robotics company UBTech has delivered hundreds of its Walker S2 humanoid robots to factories and manufacturers, including major names like BYD and Foxconn.

This marks the first time humanoid robots have been deployed at scale in real world industry environments.

I wonder if this is what the robots actually look like, because of the front page image here on Interesting Engineering, on this article written by Sujita Sinha.

Hope I said that right.

The image here is kinda crazy.

I don't know.

I was just talking about this with one of my coworkers actually the other day.

And, you know, in terms of AI assisted tools, like, do you need a whole humanoid robot body to help you wash your dishes or whatever we're gonna have this thing do?

I'd rather have a device built into the sink that washes the dishes, you know, rather than have a robot walk over to my sink and do it there.

I don't know.

I feel like, to a degree, we also personify them, put our human emotions on them, and develop some sort of hopeful belief that they have feelings too, regardless of what we're told or what science says.

Right?

And I think just seeing them as humanoid robots is gonna make us feel bad for them to a degree, like we're making someone do all that work.

But I wonder.

Who knows?

We'll see how common this becomes.

UBTech says it's completed the world's first large scale shipment of humanoid robots to active factories, marking a milestone for the industry.

These units are designed to take on human like tasks on manufacturing floors, reflecting a shift from robotics demos to real factory integration.

Oh my goodness.

Some major companies are driving eight hundred million yuan in orders for UBTech's humanoid robots.

So a lot of companies are already saying, I want this, which honestly could be pretty concerning for the Chinese labor force, you know. If they all get replaced by robots, what are they gonna do for work?

Are we just gonna keep giving them that money?

You know?

I I wouldn't think so.

So another major concern in terms of their economy would be what happens with all the workers that are gonna be replaced by these robots.

But we'll see.

This is a major new phase of automation.

Even if you don't work in manufacturing, the presence of humanoid robots in factories will influence job markets, supply chains, and the public conversation about robotics and labor.

It's gonna be a huge thing in the next few years because we have to think ethically about the people we're replacing here and the jobs that are gonna be taken up by these robots.

And, yes, it's one less job that has to be done, but it's also one less person that gets paid.

We don't really operate in a system that provides for people that get their jobs replaced by robots.

So I wonder how that might affect their public and their population.

Well, let's move on.

I got one last one for you folks today.

This one's from Emerge at decrypt dot co.

So it looks like it's a Decrypt article.

I didn't realize, but, hey yo.

This is by Jose Antonio Lanz.

Running your own local open source AI model is easy.

Here's how.

Now recently, I've definitely been doing a lot of searches into getting a new computer, getting some new parts.

Man, prices have skyrocketed.

Kill me.

Kill me.

Those prices have skyrocketed.

I think the 5070, the GPU, went from eight hundred on average to about thirteen or fourteen hundred from what I'm seeing right now.

So give me a break.

Right?

But here's a little bit of context into this article.

Powerful open source AI models like Llama, Mistral, and Phi are now efficient enough to run on personal computers.

New tools such as Ollama and LM Studio make setup simple, allowing people to use AI locally without cloud services or subscription fees.

Now, one of the biggest concerns about using models from big providers like Anthropic and OpenAI is that you are basically agreeing to have your data taken and trained on.

Not everyone wants that.

And a lot of people would love to run these tools privately.

And it's kinda cool to see this.

I wanna do that.

So that's why I'm looking into new PCs and stuff and getting something ready, hopefully so I can get that all nice and launched and have a little private LLM running from this bad boy that I can prompt from anywhere.

Like, that would be cool.

So that's in the works, but something to think about if you, have a nice setup at home and you can run a little something.

But a growing wave of tools is making it easy for anyone to run their own AI model entirely offline.

With just a standard PC, users can load open source models and work privately without paying per request cost or sharing data with cloud providers.

Local AI models offer privacy and zero subscription costs, letting you run powerful models completely offline.

So, some of the models are called Llama, Mistral, or Phi.

They can be run privately through tools like Ollama.

So that was the one I was looking into and planning to use, but there's another one here that's mentioned called LM Studio.
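
And just to give a sense of how simple the Ollama route is, here's a minimal sketch, with the caveat that the model name is just an example: once Ollama is installed and you've pulled a model with something like ollama pull llama3, it runs a local server on port 11434 that any script can talk to.

```typescript
// Minimal chat against a local Ollama server (default port 11434).
// Assumes you've already run: ollama pull llama3
async function askLocal(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3", // any model you've pulled locally
      prompt,
      stream: false,   // one JSON object back instead of a token stream
    }),
  });
  const data = await res.json();
  return data.response; // the generated text
}

askLocal("Explain Gaussian splats in one sentence.").then(console.log);
```

Nothing in that request leaves your machine; it all goes to localhost, which is the whole point.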

And if you're a heavy user that works with private data, I definitely recommend that you shift in this direction and keep that data on your own computer instead of providing it to a cloud provider.

I think that just makes sense.

You know?

It's a little bit ridiculous to allow someone to take all your stuff and train their own machine on it to further their own profits.

You know?

Keep that stuff private.

But I'm sure we're gonna see more and more of these tools come out that make it easier and easier.

The shift is gonna give a lot of users more control over their LLM usage, lower costs, better privacy, and no reliance on big tech infrastructure or a massive cloud server.

And I think that this is kind of hinting that it is possible to have your AI tools running locally on your own, you know, dime, basically, without requiring a data center to be available to run your models and boil some water.

You know?

So let's see.

We'll see.

I'm excited about this one.

This is something I've definitely been looking into a lot recently and have been excited to get running myself.

But, yeah, you know, cost constraints and whatnot tend to be the number one thing that prevents you from even starting.

So that's in the works, but we'll see, especially with the current prices being as they are.

I wanna thank you guys again for tuning in for the show, folks.

That's really all I got for you tonight.

Sorry again for the long delay on the new episodes.

I'm recording this on Sunday, November twenty third.

I just celebrated a birthday, so if you guys wanna send me a happy birthday, drop it in the comments or something.

I don't know.

But we'll try to be a little more consistent about the shows.

I really do have to talk to Chris about nailing down which days of the week.

We had initially thought that we wanted to do Monday and Thursday.

But I just am completely unavailable right now during the week.

And Saturday or Sunday mornings just seem like the best time for me to be able to record a show.

So I'm gonna be leaning a little bit closer to that.

If you guys would still want me to post it on Monday and just record on Saturday or Sunday, that could totally be doable too.

But my idea is if I already have it recorded, why are we holding back?

You guys might as well be able to listen to it.

So let me know what you think.

Send us an email at geek at geek news central dot com or geek news at g mail dot com.

Definitely wanna hear your feedback, and drop us some, comments.

Let us know what you guys think of the episode.

If you guys wanna hear more AI stuff, more space stuff, we're happy to kinda lean into the feedback and look for more articles in the direction that you guys want to see.

So let us know.

Thanks again for tuning in.

You guys have a phenomenal night, and we'll catch you on the next one.
