It also has a 1v1 mode (player vs computer or PvP) that is just fantastic. I actually spent most of my time playing the 1v1 mode way back in the day.
If I’m not mistaken, a “militia” was understood to be an ad hoc, non-standing armed group, supplied by the resources of its members. The amendment was added so that if a militia were ever needed (again), it could be formed, because the pool of potential militia members had their own firearms. Laws limiting citizen access to firearms would hobble any new militia.
Given that armies at the time were only recently becoming “standing” (permanent) armies, and the U.S. didn’t really have one, their best option for making war was militias. They were acutely aware that the revolution began that way, and only later developed an actual (organized, separately supplied, long-term) army.
But very quickly, the U.S. developed permanent armed forces and never had to rely on militias again. At that point the 2nd amendment really should have been obsolete.
Or free Black people.
They weren’t quite the sharpest tacks in the box.
It doesn’t help that the sentence makes no sense. The second clause requires that the first be the subject of the sentence, but then the third clause starts with a new subject, and lastly there’s that weird “German” comma after “Arms.”
There’s more than one way to interpret the meaning, but strictly speaking the only syntactically accurate rendering comes out roughly as:
[The right to] a well regulated Militia shall not be infringed, as it’s necessary to the security of a free State (security meaning the right of the people to keep and bear arms).
…which is also meaningless.
It’s a stupid amendment for lots of reasons, but the big one is that it’s just shitty English.
“I lost a brother once. I was lucky. I got him back.”
“I thought you said men like us don’t have families.”
“I was wrong.”
It makes sense why a Starbucks would be across the street from a Starbucks (coffee buyers are not, as a rule, brand-loyal, so they will go to the nearest/most easily accessible spot - so Starbucks grows like a weed to prevent other shops from taking the business of fickle customers). But two Apple Stores cheek by jowl… that’s weird.
At what point does the world look at this and say that enough is enough?
Do we ever, really? Over the sum of all war-related humanitarian disasters, the West responds to very few of them, and only when it’s economically or geopolitically useful. The Palestinian crisis is no different; it’s not exceptional in any way. There’s an ongoing nightmare in DRC that’s orders of magnitude worse than what’s happening in Gaza and… no one cares. Europe and the U.S. are on the verge of disengaging from Ukraine.
The thing is, it doesn’t even matter if we “condemn this behavior.” We could do that all we want and it wouldn’t make much difference. And no one wants to be interventionist - there’s too much awful history around it, and it smacks of colonialism, and it means taking resources away from “domestic issues” that always seem to matter more.
We’ve got to move away from the notion that the situation in Gaza is somehow unique. It allows us to conveniently ignore the root causes of the problem, which is much more universal, and stems from the ongoing sense of cultural superiority on the part of Europe and the U.S.
I’m curious: is this still a thriving community? Intel-based Macs are on the verge of being fully deprecated by Apple, so Hackintoshes will (within a year or two) be little more than “vintage computers.” Sure, you might manage to make one more cheaply or more powerful than an Intel Mac, but at some point there isn’t much that’s going to run. Already they’re stuck with older OS releases.
Duct. “Duck” is a brand name.
Yes. But also mostly no.
Wikipedia:
“Duck tape” is recorded in the Oxford English Dictionary as having been in use since 1899 and “duct tape” (described as “perhaps an alteration of earlier duck tape”) since 1965
and:
In 1971, Jack Kahl bought the Anderson firm and renamed it Manco. In 1975, Kahl rebranded the duct tape made by his company. Because the previously used generic term “duck tape” had fallen out of use, he was able to trademark the brand “Duck Tape” and market his product complete with a yellow cartoon duck logo. Manco chose the term “Duck”, the tape’s original name, as “a play on the fact that people often refer to duct tape as ‘duck tape’”, and as a marketing differentiation to stand out against other sellers of duct tape.
People should really do the bare minimum double-check before showing their whole ass.
As others have noted, “duct tape” is the last thing you want to use on ducts. Better to actually call it “duck tape,” as it was for the first 65 years of its existence.
It’s fish and children, isn’t it?
Welcome back to the 1980s!
Bots aren’t a “problem” for Twitter unless the advertisers think there are more of them than there are real users. But if you can convince advertisers that you’re reducing bots, while also not actually reducing bots, you’ve got a winning formula. Bots are reliable posters, they contribute a lot more than a regular user, and they make high-engagement tweets/posts/tweex that end up getting a lot of views, aka advertising opportunities.
In other words, the idea might have the opposite effect - keeping potential new human users out, but allowing the bots in.
The galaxy brain shit here is that I suspect the bot problem actually doesn’t concern Musk in the way he claims. If he can make it seem like there are fewer bots (because of these policies) while at the same time not actually getting rid of them, the engagement level stays up and the advertisers are happy in their ignorance. Bots are better users: they’re not fickle, they don’t go to sleep, they can be reliably expected to be posting more regularly than normal users. The trick for Musk is convincing everyone they’re gone.
Stick with “American,” because not only is it partially accurate geographically, it’s completely accurate in terms of how self-centered we are as a nation - enough to think we’re the only ones who count.
@JohnnyCanuck is right in a bunch of important ways, but there is one additional factor to consider. The reason the Hollywood guild system works the way it does is that no one is contracted to any given studio. It used to be that actors and writers were required to have locked-in contracts - they couldn’t work for anyone else - but that hasn’t been true for a long time. (There are exceptions: writers and actors can choose to have multi-picture/script deals, in exchange for an up-front wad of cash, but it’s not the norm outside of the really heavy hitters.)
A standard union protects a worker’s existing job, and helps that worker negotiate terms for an existing job.
A Hollywood guild protects a worker’s future jobs - because the one they have now will absolutely not be the one they have in 2 years, a year, maybe even in 6 months. This is the nature of the Minimum Basic Agreement (MBA): it dictates minimum terms of employment. It’s not designed to give writers/actors the best deal, it’s designed to give them the least shitty deal the studios will agree to.
Why does this matter?
It matters because what most people think of as “Hollywood” is all the extremely pretty, extremely powerful, extremely prolific actors and writers who make lots of money and show up on magazine covers and in media podcasts. (No writer is showing up on a magazine, I don’t care how pretty he is.) But the MBA is there for the day players, the low rung people, the staff writers, the gal who had one spec script produced in her career so far.
What the WGA managed to achieve recently with its negotiations is an absolutely phenomenal success. But it still only really impacts the MBA - the minimum basic agreement!
So… uh… why does this fucking matter?
The game industry doesn’t really have superstars. It doesn’t have the equivalent of Tom Cruise and John August. At least not at scale. And the ones who are that shiny are usually studio heads or creative directors, not “employees.” So they wouldn’t be covered by a union anyway (which cannot apply to managers - i.e. anyone who has authority over other workers).
Suggesting that the game industry adopt the Hollywood guild model is to suggest forcing a pear into a box shaped like an apple. The MBA protects low level employees in their future employment, and isn’t really all that great - at least not the way most non-insiders think. It still results in a ridiculous number of workers making poverty wages.
Is that what you want a game voice actor to have? A minimum basic agreement for their future employment? A programmer? A graphic designer?
No. You want them to be in a union.[1] Which will protect their current jobs and create conditions for advancement, sufficient income at the lowest tiers and long term stability. None of which the Hollywood guilds really do.
[1] The distinction between a union and a guild isn’t a “real” one in modern U.S. law, strictly speaking. But conceptually, as above, a union is for people in regular employment with a single employer, and a guild is for (effectively) contract workers.

The terminology of “guild” comes from the older, pre-industrial idea of “the X workers’ guild” (masonry, carpentry, bricklaying, etc.). These were really just social organizations that sorta kinda acquired enough power to flex their muscles against the people contracting them, by making minimum demands in solidarity within the guild (does that sound familiar…?). Guilds eventually “became” unions in the modern sense once people were working with single employers over the long term.

Put simply (and a bit stupidly): unions make contracts between workers and companies; guilds make contracts between workers and their industry. Part of the reason gig workers (Uber/Lyft/etc.) in California have been more active about getting better terms is that the state is super familiar with how guilds work, which is exactly what gig workers need, since their employment is with the industry as a whole (they can work for more than one company), not so much with a specific company. (It’s also why they’re having a much harder time - California employers are equally familiar with all the shenanigans Hollywood studios use to suppress the guilds that feed into them.)
Fun story! They came to that conclusion because they discovered a text which had what they believed was another very similar word (“epiousi”) that, in context, meant “necessary” or “enough for now.” That text was a shopping list.
Then the text got lost for a long time, and when they found it again, new eyes on it realized that they’d misread the word, so it was back to square one.
Textual critics are fairly confident that much of the text of the New Testament was reliably copied up to the first extant manuscripts, and for the passages that are very obviously messed up, they have a decent set of analytical tools to help them retroject the likeliest original wording. Not perfect, but decent.
Things were written mostly by eyewitnesses or those who interviewed eyewitnesses.
The scholarly consensus is that this is not the case. The earliest written Gospel (Mark) couldn’t have been written any earlier than the occupation of the Temple during the First Jewish Revolt in 66–67 CE, and all indications are that the author was writing down traditions that came from his community and others, with no immediate connection to any “eyewitnesses.”
(Source: I have a PhD in this stuff.)
Ever gone through a Walmart checkout?
I’ve never seen longer nails than on those cashiers, and they have to press buttons and touch screens all the damn time.
NPR is not free; it’s paid for by taxes, which means that every U.S. citizen is in fact paying for news whether they like it or not. And “not for profit” is not the same as “no cost to the consumer.” In addition, most of the outlets for NPR are local public radio stations that are - you guessed it - funded by taxes (as well as fund drives).