• 0 Posts
  • 158 Comments
Joined 1 year ago
Cake day: July 1st, 2023



  • The DoD dropped its 7-pass and 3-pass overwrite requirements in 2006.

    Later in 2006, the DoD 5220.22-M operating manual removed text mentioning any recommended overwriting method. Instead, it delegated that decision to government oversight agencies (CSAs, or Cognizant Security Agencies), allowing those agencies to determine best practices for data sanitization in most cases.

    Meanwhile, the U.S. National Institute of Standards and Technology (NIST), in its Guidelines for Media Sanitization of 2006 (PDF), stated that “for ATA disk drives manufactured after 2001 (over 15 GB) clearing by overwriting the media once is adequate to protect the media.” When NIST revised its guidelines in late 2014, it reaffirmed that stance. NIST 800-88, Rev. 1 (PDF) states, “For storage devices containing magnetic media, a single overwrite pass with a fixed pattern such as binary zeros typically hinders recovery of data even if state of the art laboratory techniques are applied to attempt to retrieve the data.” (It noted, however, that hidden areas of the drive should also be addressed.)

    For ATA hard disk drives and SCSI hard disk drives specifically, NIST states, “The Clear pattern should be at least a single write pass with a fixed data value, such as all zeros. Multiple write passes or more complex values may optionally be used.”







  • “Free and open source software.” It’s an ethos that says code should be free and open for people to use and improve as they see fit. The core of it (under copyleft licenses, at least) is that if you modify software that is FOSS, your modified software must also be FOSS. So over time the software, and what it’s used for, improves, changes, and widens. Lucky for us, the movement has been ongoing for 50+ years, so it’s a mature ethos whose benefits are everywhere. Most of the internet runs on FOSS. Lemmy itself is FOSS.

    It doesn’t necessarily mean an app is more private, but it does mean you can generally self host, as the commenter said. There isn’t a profit motive with most FOSS, at least not at its core, so there is generally little incentive to harvest data. There is also a heavy overlap between FOSS advocates and privacy advocates, so these projects tend to be more privacy conscious via local data storage or encryption.






  • Tons of questions here, but sure I’ll give it a go.

    Any autonomous or nearly autonomous hardware device would be taxed. Exceptions can apply. Maybe autonomous tractors are not taxed because food is needed, but unemployed farmers also need to be cared for.

    As to the code question and M365: maybe, maybe not. It may be reasonable to tax all cloud automation as a whole, or maybe just all SaaS, leaving IaaS and PaaS out of it. Exceptions may apply.

    The tax would be on the good or service forever, yes. If you displace human workers with automation, then they need their basic needs met, both for human decency and so they don’t tear society to pieces, justifiably in my mind.

    Incumbent companies already using automation may have an advantage, but only until they deploy a new robot or new automation. That advantage goes away if they stay 5–10 years behind to avoid the tax. If they keep avoiding it, newer companies that pay the tax but get a huge productivity boost will surpass them. That will incentivise incumbents to adopt the taxed goods or services and remove any initial advantage.

    I think I would also be okay with “no tax until you hit X automations” as well. You clearly can’t give tax breaks on employees, as not employing people will be the whole point of this, but you could likely work it out.


  • Most laws aren’t retroactive. If you do the thing before it’s illegal, then you skated by. That could very easily be the answer here, especially as almost all physical automation barely exists yet. If a company deploys now, they don’t pay the tax, but they will when they upgrade models.

    As to code automation, the same rules apply. Excel macros get by, but I would apply the tax to companies that replace white-collar jobs via SaaS or other applications as their core business model, or to that line of business for vendors that do many things. It would have to be refined as to where you draw the line, but you could.


  • mosiacmango@lemm.ee to Technology@lemmy.world, *Permanently Deleted* (edited, 24 days ago)

    It’s annoying, but at least this is an independent, worker-owned, four-person outfit that got its start when Vice went bankrupt.

    Here is the article:

    For the past two years an algorithmic artist who goes by Ada Ada Ada has been testing the boundaries of human and automated moderation systems on various social media platforms by documenting her own transition.

    Every week she uploads a shirtless self portrait to Instagram alongside another image which shows whether a number of AI-powered tools from big tech companies like Amazon and Microsoft that attempt to automatically classify the gender of a person see her as male or female. Each image also includes a sequential number, year, and the number of weeks since Ada Ada Ada started hormone therapy.

    In 2023, after more than a year into the project which she named In Transitu, Instagram removed one of Ada Ada Ada’s self portraits for violating Instagram’s Community Guidelines against posting nudity. We can’t say for certain why Instagram deleted that image specifically and whether it was a human or automated system that flagged it because Meta’s moderation systems remain opaque, but it was at that moment that Instagram first decided that Ada Ada Ada’s nipples were female, and therefore nudity, which isn’t allowed on the platform. On Instagram, shirtless men are allowed and shirtless women are also allowed as long as they don’t show nipples, so what constitutes nudity online often comes down to the perceived gender of an areola.

    “I’m really interested in algorithmic enforcement and generally understanding the impact that algorithms have on our lives,” Ada Ada Ada told me in an interview. “It seemed like the nipple rule is one of the simplest ways that you can start talking about this because it’s set up as a very binary idea—female nipples no, male nipples, yes. But then it prompts a lot of questions: what is male nipple? What is a female nipple?”

    In Transitu highlights the inherent absurdity in how Instagram and other big tech companies try to answer that question.

    “A lot of artists have been challenging this in various ways, but I felt like I had started my transition at the end of 2021 and I also started my art practice. And I was like, well, I’m actually in a unique position to dive deep into this by using my own body,” Ada Ada Ada said. “And so I wanted to see how Instagram and the gender classification algorithms actually understand gender. What are the rules? And is there any way that we can sort of reverse engineer this?”

    While we can’t know exactly why any one of Ada Ada Ada’s images is removed, she is collecting as much data as she can in a spreadsheet about which images were removed, why Instagram said they were removed, and, to the best of her knowledge, whether the images’ reach was limited.


    That data shows that more images were removed further into her transition, but there are other possible clues as well. In the first image that was removed, for example, Ada Ada Ada was making a “kissy face” and squeezing her breasts together, which could have read as more female or sexual. Ada Ada Ada was also able to reupload that same image with the nipples censored out. In another image that was removed, she said, she was wearing a lingerie bra where her nipples were still visible.

    “But then again, you have this one where I’m wearing nipple clamps, and that didn’t do anything,” she said. “I would have expected that to be removed. I’ve also had another picture where I’m holding up a book, Nevada by the trans author Imogen Binnie. I’m just holding a book and that was removed.”

    Ada Ada Ada also maintains a spreadsheet where she tracks how a number of AI-powered gender classifiers—Face++, face-api.js, Microsoft Azure’s Image Analysis, Amazon Rekognition, and Clarifai—are reading her as male or female.

    Experts have criticized such gender classifiers for often being wrong and particularly harmful for transgender people. “You can’t actually tell someone’s gender from their physical appearance,” Os Keyes, a researcher at the University of Washington who has written a lot about automated gender recognition (AGR), wrote for Logic in 2019. “If you try, you’ll just end up hurting trans and gender non-conforming people when we invariably don’t stack up to your normative idea of what gender ‘looks like.’”

    “I’ve definitely learned that gender classifiers are an unreliable and flawed technology, especially when it comes to trans people’s gender expression,” Ada Ada Ada said. “I regularly see my algorithmic gender swing back and forth from week to week. In extension to that, it’s also fascinating to see how the different algorithms often disagree on my gender. Face++ (which is a Chinese company) tends to disagree more with the others, which seems to suggest that it’s also a culturally dependent technology (as is gender).”

    As Ada Ada Ada told me, and as I wrote in another story published today, continually testing these classifiers also reveals how they work in reality versus how the companies that own them say they work. In 2022, well into her project, Microsoft said it would retire its gender classifier following criticism that the technology can be used for discrimination. But Ada Ada Ada was able to continue using the gender classifier well after Microsoft said it would retire it. It was only after I reached out to Microsoft for comment that the company learned that she and what it said was a “very small number” of users were still able to access it because of an error. Microsoft denied them access after I reached out for comment.

    Another thing that In Transitu reveals is that, on paper, Instagram has a plain policy against nudity. It states:

    “We know that there are times when people might want to share nude images that are artistic or creative in nature, but for a variety of reasons, we don’t allow nudity on Instagram. This includes photos, videos, and some digitally-created content that show sexual intercourse, genitals, and close-ups of fully-nude buttocks. It also includes some photos of female nipples, but photos in the context of breastfeeding, birth giving and after-birth moments, health-related situations (for example, post-mastectomy, breast cancer awareness or gender confirmation surgery) or an act of protest are allowed.”

    But in reality, Instagram ends up removing content and accounts belonging to adult content creators, sex educators, and gender nonconforming people who are trying to follow its stated rules, while people who steal adult content or create nonconsensual content game the system and post freely. As 404 Media has shown many times, nonconsensual content is also advertised on Instagram, meaning the platform is getting paid to show it to users. It’s not surprising that trying to follow the rules is hard when users struggle to reverse engineer how those rules are actually enforced, and nonsensical for people who don’t fit into old, binary conceptions of gender.


  • That’s the rub, isn’t it? From a societal view, having manual labor done entirely by robots is a positive game changer, as it protects human health with no loss in standard of living. But because we will just lay people off with no support, it will instead plunge our society into despair.

    The automation tax that Gates and others proposed to fund UBI and social support networks is making more and more sense.