meatbag

@meatbag@dragonscave.space

I'm a #tech enthusiast, local #foodie, and lover of staying alive 🤣. I'm passionate about using tech to make the world a better place and exploring #poetry. I'm also a big fan of local #food - nothing beats the taste of a good traditional dish.

meatbag, to linux

Linux blind users, listen to this and let me know what you guess this actually is, and most importantly, what you think! I'm just gauging interest as this is still in its early infancy, and I was wondering if it was worth continuing. As you can infer, I'm already far enough to have a working yet incomplete prototype!

#linux #accessibility #blind

meatbag, to streaming

Hi! I'm happy to announce that we'll be streaming Final Hour next Saturday! Join me, @blind_lightning, @TheFake_VIP, and Tunmi13 for some thrilling matches! It's going to be great, so come hang out with us and get all your questions answered in real time. Don't forget to spread the word, and mark your calendars for next Saturday, February 17, 2024, at 8:30 PM UTC. We're counting down the days and can't wait to see you there!
The official stream on the Lower Elements Club channel: https://www.youtube.com/watch?v=LRBFui4Qlik
The stream on The Fake VIP's channel: https://www.youtube.com/watch?v=E7ynjRUOA8c
The stream at Tunmi13's channel: https://www.youtube.com/live/hgWIml9vzeU?si=riyTsZRkhbzjcTji
Again, February 17, 2024 at 8:30 PM.
#FinalHour #Stream #streaming

meatbag, to random

Today is my birthday. I'm grateful to have friends who care enough to remember, and indeed grateful for any strangers who might decide to say happy birthday while passing by! :)

meatbag, to android

Hey, I've talked before about the major thing slowing down Android screen readers: double taps! Just to recap: when you tap the screen, the screen reader waits for a moment to see if you'll tap again, so it can register a double tap. Only after that delay does it register a single tap and tell you what's under your finger. This makes tapping slow! In normal navigation it's fine, but when typing it slows us down a lot, not only because we have to wait a fraction of a second for each single tap to register, but also because we can't touch type quickly: if our taps come too fast, they start registering as double taps!
The fix for that is really easy. Just make screen readers ignore the double-tap logic in the keyboard area while the keyboard is up (if you've selected any typing mode other than double-tap typing). There's no need to handle double taps in that area: you just put your finger down, it instantly registers a single tap and tells you what's under your finger, and when you lift, it types. That would be great, and I'm very sure it's easy to implement (oh, how I wish I knew enough Java and the Android API to implement that in TalkBack...).
Because Android itself is not slow at all! In fact, it's instant as far as I can tell. And you can test that, even with your screen reader, just to see that the reader is artificially slowing itself down.
First, focus on an item. Now double-tap, but make your second tap pretty late: just before the timer ends, not much before it, and not after it. You'll notice the double tap registers right after your second tap, instantly if you get the timing right. And that's the frustrating part: the screen reader is intentionally slowing itself down, without giving us an option to change the preset timer or implementing the easy keyboard fix that would make touch typing possible and fast!
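To make the timing concrete, here's a tiny Python sketch of the two behaviours. It's purely illustrative: the timeout value and all function names are mine, not TalkBack's.

    DOUBLE_TAP_WINDOW = 0.3  # hypothetical timeout, in seconds

    def on_tap(pending_tap_time, now):
        """Normal navigation: the first tap is held back for DOUBLE_TAP_WINDOW."""
        if pending_tap_time is not None and now - pending_tap_time <= DOUBLE_TAP_WINDOW:
            return None, "double tap: activate the focused item"
        # Hold the tap back; the announcement only fires when the timer below expires.
        return now, None

    def on_double_tap_timer_expired():
        """Fires DOUBLE_TAP_WINDOW seconds after a lone tap: this wait is the delay."""
        return "single tap: announce the item under the finger (delayed)"

    def on_keyboard_touch(finger_down):
        """Proposed keyboard-area behaviour: skip the double-tap logic entirely."""
        if finger_down:
            return "announce the key under the finger (instant)"
        return "finger lifted: type the key (instant)"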
#Android #AndroidAccessibility #ScreenReader #UserExperience #AccessibilityIssues #Accessibility #Talkback

meatbag,

@evilcookies98 I have no idea what you're referring to. But if you meant turning off explore by touch, then no, because that turns it off for the entire screen, and unless you're direct-touch typing, it wouldn't help. If you're using the lift-to-type mode, it wouldn't work with explore by touch disabled.

TheQuinbox, to windows

So are wxWebViews totally inaccessible, or am I doing something wrong? I create one with wxWebView::New(this, wxID_ANY, "http://example.com");, and when my app launches I just hear "wxWebView". I can't arrow through or browse the page at all except with object nav.

meatbag,

@TheQuinbox When I last played around with it, I remember that you needed to object nav somewhere inside the document and use the shortcut to perform the default action, which will focus this particular section of the document. After that, NVDA will recognize the webview, and you can navigate it just fine. It's very clunky.
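For anyone who wants to poke at the same setup without a C++ build, here's a roughly equivalent wxPython sketch (the class and window names are mine, not from the thread):

    import wx
    import wx.html2

    class BrowserFrame(wx.Frame):
        def __init__(self):
            super().__init__(None, title="wxWebView test")
            # Same idea as wxWebView::New(this, wxID_ANY, "http://example.com") in C++.
            self.view = wx.html2.WebView.New(self, url="http://example.com")
            self.Show()

    if __name__ == "__main__":
        app = wx.App()
        BrowserFrame()
        app.MainLoop()

You should hit the same behaviour with a screen reader, which makes it a quick way to test the object-nav-then-default-action workaround described above.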

meatbag, to python

I've just begun creating my own original, complete remake/rewrite of the audio game Screaming Strike. I'm primarily doing it to experiment with new technologies and venture outside my comfort zone of libraries (I mean, for a game, we're using wxPython for windowing and keyboard input!). The project is entirely open source, and anyone is welcome to contribute (we're just getting started, but the code is piling up). The goal is to make it even better than the original game for venting your rage (with HRTF, effects, and various other things so every punch feels 10x more satisfying!). Come code with me! https://github.com/meatbag93/punch
PS: I'm just a hobbyist programmer trying to make something enjoyable, so don't expect much :)
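For anyone curious what "wxPython for windowing and keyboard input" looks like in practice, here's an illustrative sketch of that kind of scaffolding (not actual code from the punch repository):

    import wx

    class GameWindow(wx.Frame):
        def __init__(self):
            super().__init__(None, title="Game")
            panel = wx.Panel(self)
            panel.Bind(wx.EVT_KEY_DOWN, self.on_key_down)
            panel.SetFocus()  # the panel needs keyboard focus to receive key events
            self.Show()

        def on_key_down(self, event):
            key = event.GetKeyCode()
            if key == wx.WXK_SPACE:
                print("punch!")  # a real game would trigger a sound here
            elif key == wx.WXK_ESCAPE:
                self.Close()
            else:
                event.Skip()

    if __name__ == "__main__":
        app = wx.App()
        GameWindow()
        app.MainLoop()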

meatbag, to windows

Oh... my... god... #Windows, seriously...
So here I am, opening the Task Manager. I sort the processes by CPU usage while struggling through the bulky, laggy interface that once upon a time was so damn simple and straightforward. And... can you guess what I find? The very first process, the one using up more CPU than anything else, is... wait for it... none other than our dear taskmgr.exe!
No kidding. The damn Task Manager itself is using more CPU than my bloated web browser, which pretty much exists to eat resources. It outpaces even my IDE and those notoriously inefficient Electron apps I've got running!
I remember a time when Task Manager was lightweight yet super easy to navigate. And now? Hell, it takes an eternity just to launch!
And don't even get me started on the Start menu... honestly, I can't even bear to dive into that disaster. What was once an efficient launcher has been reduced to... well, let's just say it's no longer what it used to be. I mean, seriously, Windows! Why? Why would you take a splendidly working piece of software and just casually turn it into something that resembles absolute crap?
#Microsoft #windows #Windows10 #Windows11

datajake1999, to random

I was having a conversation with @BTyson, and we were discussing how eSpeak has been going downhill ever since NG came along. I am curious how widespread this opinion is, and would especially value input from native speakers of languages other than English. Considering eSpeak is open source, there is nothing stopping someone such as myself from forking the original eSpeak and backporting features that users actually want. The NV Access Klatt fixes come to mind, as well as the variants that were included in NVDA before NG was implemented.

meatbag,

@datajake1999 @BTyson I don't know if the Arabic language existed in eSpeak before NG, but it absolutely sucks. Sucks so much that I cannot use it even though I am a native Arabic speaker. It's pretty much unintelligible unless you strain your focus and slow it down.

devinprater, to random

Oh my gosh y'all now Blind android users are installing this TTS Engine that just sends text to some server and gets back audio. Cause privacy don't matter. Like holy crap y'all. They are trying to cope with the loss of Eloquence because they know phones are going to 64-bit only. Dang.

meatbag,

@devinprater eSpeak for life. Never regretted loving it. Fast, lightweight, reliable, and it follows me across all my devices.
I don't think I will ever understand people's obsession with the proprietary mess that is Eloquence, where we barely even know who owns it anymore.

meatbag, to random

Can anybody comment on how long I should expect to wait to get into the Be My Eyes Virtual Volunteer beta? I've been on the waitlist for far too long, and I literally can't wait.

devinprater, to ai

Okay y'all, Bing Chat's image description thing is now coming around to mobile! You can't share from the share sheet, but you can save images and do it that way. More precise, for better descriptions. It can still make stuff up, though.

The image you sent is a photo realistic image of a black and orange box of an AMD Ryzen 5 processor. The box is on a gray background and has a large red and orange Ryzen logo on the front. The box also has the AMD logo and the words “Ryzen 5” and “5000 Series Processor” on the front. There is a small white sticker on the front of the box that reads “AM4 Socket” and a small orange sticker that reads “Zen 3 Architecture”.

#Bing #AI #GPT4

meatbag,

@devinprater How do you upload images to it on mobile? Do you do it through the app, or what?

meatbag,

@devinprater I cannot see any option to upload an image. Can you explain where you'd find that option, so I can tell whether I'm looking in the wrong place or just haven't received the new version yet?

meatbag,

@devinprater Shame, it appears this update hasn't reached me yet. I'm excited for it though, because I have way too many uses for it.

simon, to random

A robot that converses using GPT would become more "tired" as its conversational context grew larger, meaning its "energy level" would be based entirely on socialization. It would go to sleep when its context was full, and the sleep process would consist of summarizing the day's conversation and integrating it into the existing summary, which is a bit like a long-term memory.

meatbag,

@simon When we have good enough technology, we should have these robots, but instead of summarizing, they would just fine-tune themselves on all the data from their latest interactions so that they literally remember everything, perhaps with a garbage collector that deletes extremely old memories, like humans do... Fine-tuning also contributes to their uniqueness in the long term, because they accumulate more and more data about the people they usually interact with and can personalize themselves as they go.
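A toy sketch of the summarize-and-integrate loop being described in this thread, just to make the idea concrete (everything here, including the summarize() placeholder and the context limit, is hypothetical):

    CONTEXT_LIMIT = 4000  # pretend token budget for the robot's working context

    def summarize(text):
        # Placeholder: a real system would call a language model here.
        return text[:200]

    class RobotMemory:
        def __init__(self):
            self.long_term_summary = ""
            self.todays_context = []

        def remember(self, utterance):
            self.todays_context.append(utterance)
            if sum(len(u) for u in self.todays_context) > CONTEXT_LIMIT:
                self.sleep()  # "tired": the working context is full

        def sleep(self):
            # Fold today's conversation into the existing long-term summary.
            day = " ".join(self.todays_context)
            self.long_term_summary = summarize(self.long_term_summary + " " + day)
            self.todays_context = []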

meatbag, to gpt

Please stop trusting #zeroGpt or other #GPT detectors. The thing with #AI generated text is that it's text, and text is just text. There are very limited ways to express anything in text: what you write and what an AI writes use the same language, grammar, and letters. AI doesn't leave any metadata or hidden marker in the text for these detectors to spot.
The current models do have patterns they tend to use in their generations, and these detectors more or less look for those patterns. But the thing is, these patterns are not special at all, and literally anyone could unintentionally write like an AI without knowing it. Guess what these tools will say when they see that person's text? They'll happily give a ridiculously high rating that it was written by AI. That's it. These tools literally just take a guess, and their guess is as good as yours. In fact, most of the time your guess would be much more educated, since you know the person, how likely they are to actually use an AI, and what their typical writing style is.
People blindly trusting these detectors are affecting the lives of many people through false positives.

Note: I hugely oversimplified how these detectors work; you can search the internet for more information, but their sophistication doesn't make them more reliable. What I said above still holds true.
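To give a flavour of what "looking for patterns" can mean in practice, here's one common style of heuristic: scoring text by how predictable it looks to a small language model. This is only an illustration of the general idea, not the implementation of ZeroGPT or any other real detector:

    import torch
    from transformers import GPT2LMHeadModel, GPT2TokenizerFast

    tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")
    model.eval()

    def perplexity(text):
        # Lower perplexity = the model finds the text more predictable,
        # which some detectors treat as a sign of machine generation.
        enc = tokenizer(text, return_tensors="pt")
        with torch.no_grad():
            out = model(enc.input_ids, labels=enc.input_ids)
        return torch.exp(out.loss).item()

    # A human writing plain, conventional prose can easily land in the same
    # score range as model output, which is exactly where false positives come from.
    print(perplexity("The quick brown fox jumps over the lazy dog."))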

meatbag,

Not to mention that even our current AI is smart enough to get around these detectors with clever enough prompts, while very real humans are being falsely flagged. It's CAPTCHA all over again...

meatbag, to reddit

These days I constantly see new articles and posts about people who have deleted their #Reddit comments and threads, only to see them reappear a few days later. Wtf? I can't even put into words how wrong this is.
I am genuinely angry that not enough people are talking about this, and that not enough people understand exactly what it could imply. I am frustrated because I know Reddit will come out of this with barely a scratch: the majority of people really like to complain about Reddit using Reddit as their platform to do so, statistically barely anybody will actually leave, and this whole thing "will pass", to quote Steve Huffman.
In my book, by continuing to use Reddit, you support their actions and you have a hand in how much worse everything will become. No exceptions. Reddit clearly chose what it wants to be, and you are responsible for choosing to continue after knowing that, supporting them in the process.
The website no longer deserves you. You matter and you are important. Stop using Reddit from now on; you can make a change.

devinprater, (edited ) to accessibility

Some blind Android users really want the Eloquence TTS engine back. It will die when 64-bit phones become the norm. They went as far as seriously debating whether they could ask phone carriers to step in. It's sad, because Google could easily have licensed Eloquence, put it in a 32-bit ARM container, and there you go. It's sad that Apple is the only big corporation that spent five minutes and thought "Oh hey, we have a license for this now, let's containerize this and ship it for VoiceOver." It's sad that Google doesn't inspire confidence from the blind community at large in its ability to uphold an accessible OS and a competitive screen reader. And it's definitely sad that another TTS engine hasn't come along that is any better than Eloquence, which is from the 90s.

#accessibility #apple #google #tts #blind

meatbag,

@devinprater Happily using eSpeak over here ☺️

TheQuinbox, to random

Say what you want about Xcode accessibility, or VoiceOver in general, but it has nothing on Android Studio. Hot garbage, both on Mac and Windows. I have to wonder if the person who wrote its accessibility features documentation ever actually used it with a screen reader. F10 doesn't work, the code folding settings section is just a bunch of unlabeled checkboxes that cause my VO cursor to fly all over the place, pressing done/finish buttons doesn't work with the VO cursor (you have to tab), and that's just the tip of the iceberg. I seriously hope I can do this with something like VSCode, because oh god.

meatbag,

@TheQuinbox Except you could choose not to use Android Studio. Everything is available to you outside it, even if it makes things easier. Whereas with Mac and iOS, you are pretty much forced to use Xcode somewhere down the line, as far as I understand.

meatbag,

@TheQuinbox Are you sure that adding custom semantics actions through this widget doesn't work on iOS? The documentation says it should, but I don't own an iOS device... https://api.flutter.dev/flutter/widgets/Semantics-class.html
Unless there is another type of action that iOS supports but I don't know about. In case this one doesn't work, I'd consider opening an issue; it might be worth it.

meatbag,

@TheQuinbox I thought custom semantics actions map to these, but I haven't tested them, so fair enough.

simon, to random

deleted_by_author

meatbag,

@simon Yeah, just dump the APK from your phone. Keep in mind that when your friend installs the APK, they will lose updates to this app through the Google Play Store until they uninstall it and reinstall it through the Play Store.

meatbag, to random

I should polish and publish my Flutter notes. I wrote a pretty big collection of notes on certain widgets and properties and how they affect the layout, from a blind person's perspective. Most are about ideas such as main and cross axis alignments that I struggled to make sense of, since I can't actually see the screen, but came to understand after experimenting, and also with the help of a few wonderful people from the Flutter community who spent literal hours describing the how and why of things. Of course, the notes won't magically let you make visually captivating apps, but I think they help a good deal, as a blind programmer, to position and size your stuff properly, which even helps screen readers with explore by touch. I definitely need to organize them first, because they are currently all over the place: I originally wrote them for myself and didn't expect them to grow this big.

meatbag, to random

I'm generally not feeling it today, from the moment I woke up, so this post is going to be out of the ordinary. With a lot of past events I either read about or witnessed myself, as well as the whole whirlwind around Reddit these days, it's during times like these that I can't help but reflect on the fragility of human nature and our susceptibility to being pulled into the black hole of wealth and power. It's a terrifying thought, really. The idea that too much money and fame could change a person, turning them into someone unrecognizable, someone who forgets the importance of genuine connections and the people who have supported them along the way.
I'm just a regular human being, and the thought of having too much money and fame scares the living daylights outta me. I mean, seriously, I'm terrified that I'll become this money-crazed, fame-hungry version of myself who doesn't give a damn about the people who appreciate me or my work.
So I don't wanna be the kind of person who only cares about money, brushing off the folks who appreciate my stuff and helped me get to where I am. That's not who I wanna be, and it sure as hell ain't the kind of legacy I wanna leave behind.
That's why I'm laying down the law right now. Starting from this very moment, all my projects are gonna be free as in freedom, open source, the whole shebang. I wanna protect people from that possible future version of me, the one who's all about the money and doesn't give a damn about anything else.
I'm making a promise to myself and to anyone who's ever supported me: I'll never let greed or fame blind me. I won't let the cash flow turn me into some heartless, money-grubbing jerk. No way!
I'm gonna stay true to my roots, keep my projects accessible and open to all, and show some love and appreciation to the people who've got my back. I'm not just gonna take their money and then toss 'em aside like yesterday's news. That's not how I roll, and it's not the kind of person I wanna become.
