'Googlers against genocide' lead sit-ins, protests coast-to-coast at tech giant's offices

Forum rules
Keep News and Politics about News and Politics.

Do not post full articles from other websites. Always link back to the source.

Discuss things respectfully and take into account that each person has a different opinion.

Remember that this is a place for everyone to enjoy. Don't try to run people off the site. If you are upset with someone, use the foe feature.

Report when things come up.

Personal attacks are against the guidelines; however, an attack must be directed at a member of the forum to count as a violation. Lying is not against the guidelines, since it is hard for us to prove that someone lied.

Once a topic is locked, we consider the issue handled and no longer respond to new reports on the topic.
Quorra2.0
Regent
Posts: 4854
Joined: Wed Nov 21, 2018 10:39 am


WellPreserved wrote: Thu Apr 18, 2024 8:28 am
Quorra2.0 wrote: Thu Apr 18, 2024 12:23 am
Slimshandy wrote: Wed Apr 17, 2024 5:56 pm This one I agree with…
Look up the Lavender program… they’re having AI decide which sites get bombed without humans pushing the button…



Like no one saw The Terminator…
I don’t think AI should have a prominent place, if any, in military operations. Not even because of movies like The Terminator, but because you are removing humanity where humanity is greatly needed.
AI being used in the Israeli program nicknamed "Where's Daddy" is pretty horrific, but as posted, it can also be legitimately used in defense. I think AI needs its own category in the rules of engagement, and for goodness' sake, prosecute and punish war crimes, because without consequences there is no incentive to stop and nothing to deter another country from doing the same.

I'm not sure but I would think "Lavender" is a war crime and if so, the US would have the legal imperative to tell Google to stop supplying tech to Israel. No protest needed.
When you remove humans, you remove humanity. The distance it creates also creates a next-level desensitization.
Momto2boys973
Princess
Posts: 20234
Joined: Wed May 23, 2018 5:32 pm


Good. Because they were all fired.
Pjmm wrote: Wed Apr 17, 2024 10:09 pm
Momto2boys973 wrote: Wed Apr 17, 2024 9:55 pm Those protests you mentioned are protests to improve working conditions. That's different from "protesting" because you don't like a deal the company made, which doesn't affect you in the slightest. So just quit.

They were arrested, BTW

https://www.cnbc.com/2024/04/17/google ... ffice.html

There are probably going to be some vacancies at Google 🤷🏼‍♀️
Pjmm wrote: Wed Apr 17, 2024 9:14 pm

Workers have protested companies since the Industrial Revolution. This is nothing new. It's why we have the FDA, the forty-hour work week, and child labor laws. All from protests of one kind or another.

As far as boycotting, I can stop giving money to Amazon. In fact, I didn't renew my Prime this year. I'm making it a point to buy from other companies. But given that Google is, as I said, a monopoly, not using them is difficult. They've been accused of pushing other search engines out. They're in Android phones, and I'm not certain those apps can be removed. I can remove them from my iPhone, but that leaves plenty of people without a choice. Plus, I'd have to delete my Gmail, download all my documents and pictures, and update God knows how many contacts and clients I do business with. Now, yes, I can do this. I also don't pay them, although I'm sure they use my data for anything they like. That's a whole other issue.

Anyway, Google is so much a part of our lives that I think both workers and citizens have the right to ask what their practices are. It's fine if they have military contracts and develop AI for war. But I hoped AI would be used to keep soldiers out of war, not used against civilians, regardless of what side they're on. Now, the demonstrators have admitted they don't know if Nimbus or whatever it is has been used against civilians. I think they should figure that out before they protest further. But I can't fault them for being concerned.
I'm sure these workers went in knowing that might be the outcome.
❤️🇮🇱 עמ׳ ישראל חי (Am Yisrael Chai: the people of Israel live) 🇮🇱❤️
Momto2boys973
Princess
Posts: 20234
Joined: Wed May 23, 2018 5:32 pm


They’re not removing humans. It’s an aid, not a replacement. Let’s not jump into paranoid conspiracy theories here.
Every technology since the invention of the wheel has been used for good or bad. They said the same thing during the Industrial Revolution, that humans would be replaced. And yet here we are, working with the aid of machines. Thing is, if the good guys refuse to use it because it can be misused or abused, the bad guys will.
Quorra2.0 wrote: Thu Apr 18, 2024 10:39 am
WellPreserved wrote: Thu Apr 18, 2024 8:28 am
Quorra2.0 wrote: Thu Apr 18, 2024 12:23 am

I don’t think AI should have a prominent place, if any, in military operations. Not even because of movies like The Terminator, but because you are removing humanity where humanity is greatly needed.
AI being used in the Israeli program nicknamed "Where's Daddy" is pretty horrific, but as posted, it can also be legitimately used in defense. I think AI needs its own category in the rules of engagement, and for goodness' sake, prosecute and punish war crimes, because without consequences there is no incentive to stop and nothing to deter another country from doing the same.

I'm not sure but I would think "Lavender" is a war crime and if so, the US would have the legal imperative to tell Google to stop supplying tech to Israel. No protest needed.
When you remove humans, you remove humanity. The distance it creates also creates a next-level desensitization.
❤️🇮🇱 עמ׳ ישראל חי (Am Yisrael Chai: the people of Israel live) 🇮🇱❤️
WellPreserved
Donated
Princess
Posts: 10007
Joined: Sun Jan 19, 2020 9:52 pm


Quorra2.0 wrote: Thu Apr 18, 2024 10:39 am
WellPreserved wrote: Thu Apr 18, 2024 8:28 am
Quorra2.0 wrote: Thu Apr 18, 2024 12:23 am

I don’t think AI should have a prominent place, if any, in military operations. Not even because of movies like The Terminator, but because you are removing humanity where humanity is greatly needed.
AI being used in the Israeli program nicknamed "Where's Daddy" is pretty horrific, but as posted, it can also be legitimately used in defense. I think AI needs its own category in the rules of engagement, and for goodness' sake, prosecute and punish war crimes, because without consequences there is no incentive to stop and nothing to deter another country from doing the same.

I'm not sure but I would think "Lavender" is a war crime and if so, the US would have the legal imperative to tell Google to stop supplying tech to Israel. No protest needed.
When you remove humans, you remove humanity. The distance it creates also creates a next-level desensitization.
I agree, and Google did too when it dropped Project Maven due to employee pressure.

Google's AI principles are definitely being questioned and tested.

"AI applications we will not pursue
In addition to the above objectives, we will not design or deploy AI in the following application areas:

1. Technologies that cause or are likely to cause overall harm. Where there is a material risk of harm, we will proceed only where we believe that the benefits substantially outweigh the risks, and will incorporate appropriate safety constraints.

2. Weapons or other technologies whose principal purpose or implementation is to cause or directly facilitate injury to people.

3. Technologies that gather or use information for surveillance violating internationally accepted norms.

4. Technologies whose purpose contravenes widely accepted principles of international law and human rights.

As our experience in this space deepens, this list may evolve."
"The books that the world calls immoral are books that show its own shame." - Oscar Wilde
Della
Princess
Posts: 22331
Joined: Sun Jun 03, 2018 12:46 pm


Momto2boys973 wrote: Wed Apr 17, 2024 9:55 pm Those protests you mentioned are protests to improve working conditions. That's different from "protesting" because you don't like a deal the company made, which doesn't affect you in the slightest. So just quit.

They were arrested, BTW

https://www.cnbc.com/2024/04/17/google ... ffice.html

There are probably going to be some vacancies at Google 🤷🏼‍♀️
Pjmm wrote: Wed Apr 17, 2024 9:14 pm
Momto2boys973 wrote: Wed Apr 17, 2024 8:34 pm Learn about it, by all means. And feel free to disagree with whatever choices they make. But don't presume as an employee that you have a say in that choice. You don't like it? Quit. But how entitled do you have to be to tell the company to abandon a $2.1 billion deal because you have moral objections?
Just as a consumer. If you object to those companies’ actions, then stop giving them your money. Somehow I’m thinking all those who are so outraged won’t do that, though 🤷🏼‍♀️

Workers have protested companies since the Industrial Revolution. This is nothing new. It's why we have the FDA, the forty-hour work week, and child labor laws. All from protests of one kind or another.

As far as boycotting, I can stop giving money to Amazon. In fact, I didn't renew my Prime this year. I'm making it a point to buy from other companies. But given that Google is, as I said, a monopoly, not using them is difficult. They've been accused of pushing other search engines out. They're in Android phones, and I'm not certain those apps can be removed. I can remove them from my iPhone, but that leaves plenty of people without a choice. Plus, I'd have to delete my Gmail, download all my documents and pictures, and update God knows how many contacts and clients I do business with. Now, yes, I can do this. I also don't pay them, although I'm sure they use my data for anything they like. That's a whole other issue.

Anyway, Google is so much a part of our lives that I think both workers and citizens have the right to ask what their practices are. It's fine if they have military contracts and develop AI for war. But I hoped AI would be used to keep soldiers out of war, not used against civilians, regardless of what side they're on. Now, the demonstrators have admitted they don't know if Nimbus or whatever it is has been used against civilians. I think they should figure that out before they protest further. But I can't fault them for being concerned.
This isn't about a company making a deal. This is about a company putting its service to inhumane uses.

🤑
306/232

But I'm still the winner! They lied! They cheated! They stole the election!
Quorra2.0
Regent
Posts: 4854
Joined: Wed Nov 21, 2018 10:39 am


These aren't paranoid conspiracy theories. They're predictable, natural consequences.

AI in the military isn't new. And yes, it has been utilized as an aid, but that's no longer where it's stopping. This isn't paranoid conspiracy theory; keep up, this is information that has been announced not only by Israel but by several other countries as well. This isn't some far-distant future where our great-grandchildren are old and sitting in rockers on porches; it's the near future. What did you think they meant when they talked about combat-capable UGVs and UAVs that are outfitted with AI and machine learning and will become autonomous? I'm not sure what "completely autonomous once it has sufficiently learned" means to you with regard to heavily armed machines.

Distancing humans is removing humans from the direct action. Studies have already found that drone operators burn out substantially faster than direct-action personnel. While PTSD is marginally lower, drone operators experience significantly higher levels of emotional disengagement, which goes beyond job performance and has other serious ramifications such as severe apathy. There have also been studies showing that with drones, the distancing isn't just in the operators but in how militaries view and address civilian casualties. It's gone from "civilians can sometimes be the tragic collateral damage of war" to "civilians are an accepted collateral damage of war." Drones are so precise in their strikes that civilian casualties should be more of a rarity, but they aren't.

Historically, whenever the "good guys" have made advancements, it has accelerated the "bad guys'" access to the same advancements, and to me, virtually unhackable isn't the same as completely unhackable.

Just because we can do something doesn't mean we should. There are lines that shouldn't be crossed. That's why humans aren't being cloned.



Momto2boys973 wrote: Thu Apr 18, 2024 12:17 pm They’re not removing humans. It’s an aid, not a replacement. Let’s not jump into paranoid conspiracy theories here.
Every technology since the invention of the wheel has been used for good or bad. They said the same thing during the Industrial Revolution, that humans would be replaced. And yet here we are, working with the aid of machines. Thing is, if the good guys refuse to use it because it can be misused or abused, the bad guys will.
Quorra2.0 wrote: Thu Apr 18, 2024 10:39 am
WellPreserved wrote: Thu Apr 18, 2024 8:28 am

AI being used in the Israeli program nicknamed "Where's Daddy" is pretty horrific, but as posted, it can also be legitimately used in defense. I think AI needs its own category in the rules of engagement, and for goodness' sake, prosecute and punish war crimes, because without consequences there is no incentive to stop and nothing to deter another country from doing the same.

I'm not sure but I would think "Lavender" is a war crime and if so, the US would have the legal imperative to tell Google to stop supplying tech to Israel. No protest needed.
When you remove humans, you remove humanity. The distance it creates also creates a next-level desensitization.