loobkoob

joined 2 years ago
[–] [email protected] 6 points 2 years ago (3 children)

"Mystery box" storytelling is the name for it and, yeah, Lost, especially, is the poster child for not executing on it particularly well. It can be exciting, and it does a good job of making following a story feel like a communal experience that everyone can participate in - speculating on where things will go next, for instance - but it also often feels like shows using it end up over-promising and under-delivering (and often leaves viewers feeling a little soured at the end).

I feel like Dark was a good example of it being well-executed, and proves it certainly can be done well. But yeah, BSG definitely didn't end up paying off for me either.

[–] [email protected] 3 points 2 years ago

I agree completely. I think AI can be a valuable tool if you use it correctly, but it requires you to be able to prompt it properly and to be able to use its output in the right way - and knowing what it's good at and what it's not. Like you said, for things like brainstorming or looking for inspiration, it's great. And while its artistic output is very derivative - both because it's literally derived from all the art it's been trained on and simply because there's enough other AI art out there that it doesn't really have a unique "voice" most of the time - you could easily use it as a foundation to create your own art.

To expand on my asking it questions: the kind of questions I find it useful for are ones like "what are some reasons why people may do x?" or "what are some of the differences between y and z?". Or an actual question I asked ChatGPT a couple of months ago based on a conversation I'd been having with a few people: "what is an example of a font I could use that looks somewhat professional but that would make readers feel slightly uncomfortable?" (After a little back and forth, it ended up suggesting a perfect font.)

Basically, it's good for divergent questions, evaluative questions, inferential questions, etc. - open-ended questions - where you can either use its response to simulate asking a variety of people (or to save yourself from looking through old AskReddit and Quora posts...) or just to give you different ideas to consider, and it's good for suggestions. And then, of course, you decide which answers are useful/appropriate. I definitely wouldn't take anything "factual" it says as correct, although it can be good for giving you additional things to look into.

As for writing code: I've only used it for simple-ish scripts so far. I can't write code, but I'm just about knowledgeable enough to read code to see what it's doing, and I can make my own basic edits. I'm perfectly okay at following the logic of most code, it's just that I don't know the syntax. So I'm able to explain to ChatGPT exactly what I want my code to do, how it should work, etc., and it can write it for me. I've had some issues, but I've (so far) always been able to troubleshoot and eventually find a solution to them. I'm aware that if I want to do anything more complex then I'll need to expand my coding knowledge, though! But so far, I've been able to use it to write scripts that are already beyond my own personal coding capabilities, which I think is impressive.

I generally see LLMs as similar to predictive text or Google searches, in that they're a tool where the user needs to:

  1. have an idea of the output they want
  2. know what to input in order to reach that output (or something close to that output)
  3. know how to use or adapt the LLM's output

And just like how people having access to predictive text or Google doesn't make everyone's spelling/grammar/punctuation/sentence structure perfect or make everyone really knowledgeable, AIs/LLMs aren't going to magically make everyone good at everything either. But if people use them correctly, they can absolutely enhance that person's own output (be it their productivity, their creativity, their presentation or something else).

[–] [email protected] 28 points 2 years ago (2 children)

It's not an ad. The entire comic is setting up a pun around the male character feeling like he's dodged a bullet.

It's mocking overly-expensive weddings. It's mocking overly-expensive cars/trucks. It's mocking self-centred people in relationships who are oblivious to their partners' perspectives. It's mocking people who have no financial sense and see their wants as needs but are happy to dismiss their partners' wants as unimportant. It's mocking car adverts. But mostly, it's just a silly pun.

[–] [email protected] 44 points 2 years ago

They used "bullet" because the entire comic is setting up the joke about having "dodged a bullet".

[–] [email protected] 3 points 2 years ago

My advice is don’t look to date

I think, even if you have the long-term intention of finding someone to date, this is the best approach. Not only does it mean you totally avoid coming off as desperate, but I think if you're actively looking to date then it can result in you holding them to ideals or standards they're not looking to or necessarily able to meet. And it can limit the connections you can form - both people to date and just new friendships - because you find yourself dismissing people who don't meet your pre-established idea of what you're looking for.

The fewer expectations you can place on someone, the more chance you have of forming a connection.

[–] [email protected] 64 points 2 years ago (10 children)

Nah, Scott Adams is a hateful bigot. He thinks black people are a "hate group" - he truly went off the rails.

I don't really think this comic reflects its author's personal views at all. C&H has always been filled with shock comedy, black comedy, deliberate insensitivity, and silly puns, and everything is a target. This one doesn't really stand out as any different to how the comic's always been.

I don't feel like there's ever really been a right-wing slant to these comics either. And I say that as someone who's ardently left-wing.

[–] [email protected] 6 points 2 years ago (1 children)

My thoughts exactly. I wish it'd been stolen instead, or something else just generally less environmentally damaging.

[–] [email protected] 33 points 2 years ago (9 children)

I don't think AI will be a fad in the same way blockchain/crypto-currency was. I certainly think there's somewhat of a hype bubble surrounding AI, though - it's the hot, new buzzword that a lot of companies are mentioning to bring investors on board. "We're planning to use some kind of AI in some way in the future (but we don't know how yet). Make cheques out to ________ please"

I do think AI does have actual, practical uses, though, unlike blockchain which always came off as a "solution looking for a problem". Like, I'm a fairly normal person and I've found good uses for AI already in asking it various questions where it gives better answers than search engines, in writing code for me (I can't write code myself), etc. Whereas I've never touched anything to do with crypto.

AI feels like a space that will continue to grow for years, and that will be implemented into more and more parts of society. The hype will die down somewhat, but I don't see AI going away.

[–] [email protected] 2 points 2 years ago

Thanks, I'm glad you found it useful! Unfortunately, I don't have any specific recommendations for further reading - it's the sort of thing I've just picked up over the years of paying attention to politics and economics rather than from any specific sources.

However, after a quick search, the US Treasury actually has a pretty good page about it.

The key graph to look at on that page is the one that shows the debt to GDP ratio (which, as it explains, is the most useful way to measure debt because it shows the country's ability to repay its loans). And you can see, despite the fact that Biden's administration has hit the highest ever raw national debt number, they've actually done a pretty good job at stabilising this ratio after the disaster that was Trump's term.

(To be fair to Trump, COVID was responsible for some of that big upwards spike. But you can see it was trending upwards - ie, bad - under Trump before COVID started.)

Which means that, yes, the national debt has continued to rise in real numbers under Biden, but the US economy has also been growing at a similar rate, meaning nothing's got any worse in real terms.
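To put rough numbers on that idea (these figures are purely illustrative and made up by me, not actual US data), here's a quick sketch showing why the ratio is what matters:

```python
# Illustrative only: if debt and GDP grow at the same rate,
# the debt-to-GDP ratio (the measure that matters) stays flat
# even though the raw debt number keeps rising.
debt, gdp = 30.0, 25.0  # trillions; made-up starting figures
growth = 0.05           # assume both grow 5% per year

for year in range(1, 4):
    debt *= 1 + growth
    gdp *= 1 + growth
    print(f"year {year}: debt = {debt:.1f}, debt/GDP = {debt / gdp:.2f}")
```

The raw debt climbs every year, but the ratio stays at 1.20 throughout - the country's ability to repay hasn't changed at all.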

The BBC also has a pretty good overview of national debt. It's relating to the UK but the principles are still the same.

[–] [email protected] 7 points 2 years ago (2 children)

The significant thing, which is harder to gauge, is how much return the country gets on its borrowing. If you borrow $1B for some infrastructure project, to be paid back over 25 years, but over those 25 years the infrastructure project generates $1.5B in taxes, job opportunities, time saved, etc, and then it continues to generate more money even after the debt has been paid off, then taking on the debt was a good thing in the long run. Sure, the national debt went up, and generally owing money is seen as bad, but it was profitable overall and the country's GDP may have increased at a higher rate than its debt over that period.
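Putting crude arithmetic on that $1B example (the 3% interest rate and the repayment approximation are my own made-up assumptions, just for illustration):

```python
# Back-of-envelope sketch of the infrastructure borrowing example.
# The 3% rate is an assumed figure; the interest calculation is a
# crude approximation (interest on an average balance of ~half the
# principal), ignoring real amortisation schedules.
principal = 1.0   # $1B borrowed
rate = 0.03       # assumed annual interest rate
years = 25
returns = 1.5     # $1.5B generated over the repayment period

total_interest = principal * rate * years * 0.5
total_cost = principal + total_interest
net = returns - total_cost
print(f"cost ≈ ${total_cost:.2f}B, return ≈ ${returns:.2f}B, net ≈ ${net:.2f}B")
```

Even after interest, the project comes out ahead - and that's before counting the revenue it keeps generating once the loan is paid off.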

And it gets even more complex when you consider less tangible results like the happiness of the population. If the country breaks even on its borrowing but the population's happiness is increased, I think most people would consider it a success, but it's hard to measure that.

A lot of the people who get overly panicky about national debts don't tend to understand them. They treat them like household budgets, or like businesses, where you absolutely need to balance your books every year or you're in trouble. But as long as the country can keep paying back its lenders and continues to grow (economically), national debts aren't that big a deal at all.

Generally, left-wing economics tends to lean towards borrowing as an investment into the future, and isn't too concerned with the actual debt number increasing as long as things are improving at a good rate. Right-wing economics tends to lean towards being loud about the national debt increasing and lean towards austerity and other cuts in order to "balance the books", often calling the left-wing approach "irresponsible". Personally, if it's not obvious, I favour the left-wing approach.

[–] [email protected] 6 points 2 years ago

It absolutely is. Although, putting aside the obvious ethical debates, I will say that at least AI has some practical uses. Crypto-currency and NFTs felt a lot like a solution looking for a problem, and while that can be true of some implementations of AI, there are a lot of valid uses for it.

But yeah, companies rushing to use AI like this, and making statements like this, just screams that they're trying to persuade investors they're "ahead of the curve", and is absolutely indicative of a hype bubble. If it wasn't a hype bubble, they'd either be quietly exploring it internally and not putting out statements like this, or they'd be putting out statements excitedly talking specifics about their novel and clever implementations of AI.

[–] [email protected] 2 points 2 years ago

I'm not sure if The Expanse (TV series) ruined Foundation (TV) for me, if it's just not a good adaptation, or if the books are just not particularly adaptable (or all three), but I agree. I only made it through the first two episodes before I gave up. I've heard the second season is better, but I don't know if it's worth forcing myself to sit through season 1 to find out.

The Expanse is just spectacular when it comes to realising its world, and with how much depth there is to its characters and politics, Foundation immediately felt very shallow in comparison. Obviously The Expanse books lay a lot of the foundations for the TV series to build on, but I think the TV series did a great job of adapting them to a new medium without much being lost in translation, and it even added to them in its own ways. Foundation's world-building, characterisation and politics all kind of just felt like the show was going through the motions and showing surface-level stuff because it felt it had to, rather than because there was any real substance there - which wasn't helped by the fact that the books don't provide much to work with in that regard.

Ultimately, I don't think the Foundation books are particularly well-suited to being adapted to the screen. They're so focused on the "bigger picture" - on civilisations rather than characters, on philosophical and sociological concepts rather than particular plot points, on macro-narrative - while TV needs characters and micro-narrative.

I will say that the TV series' idea to use three different-aged clones of Emperor Cleon, and to keep the actors persistent through the ages, seemed like a great addition. It's good to try to keep some recognisable faces while jumping across such long time periods.
