Thanks to everyone who voted:
https://qoto.org/@post/110207911042352184
Now #Logseq is even going to have ChatGPT in the sidebar, with conversations saved in /chats in your graph folder.
Logseq has made a fool of us privacy-conscious people.
I will no longer support it financially and I won't recommend it anymore.
If someone maintains a patched version that is fully FOSS and without built-in OpenAI services (kind of like Bromite with Chrome), on both the FlatHub and F-Droid repos, they can have $5/month from me and hopefully from other privacy-conscious #PKM enthusiasts.
No, PKM + ChatGPT is too big a threat for people in general. Logseq would send big chunks of personal information to a company that can do whatever it wants with that information, even exposing it to other users.
In Italy the authority responsible for protecting people's privacy blocked ChatGPT for its blatant violation of privacy laws, and in theory other European countries should do the same because of the GDPR.
People are not aware that what they write to ChatGPT is to be considered public and associated with their profile, as on a social media platform. And it is far worse than Twitter, Facebook, Google, etc., which at least have GDPR-compliant privacy policies.
@post Understood. I also know that Germany is considering following Italy with a ban. However, it's fine by me for a company to provide the AI integration as long as my data is safe when I keep the feature off. The AI integration is appealing to a portion of users, and those users should educate themselves on the privacy issues of OpenAI. #Logseq should not be the gatekeeper here. I think it's similar to the community plugins: users are responsible for assessing their own risk tolerance.
Thank you for letting me know.
Users should educate themselves on privacy? No, this is not like using Google or Facebook; this is on a totally different level: OpenAI is officially criminal in Italy.
When a person uses a service, it is assumed to be private by default, and if something is shared, you are supposed to accept an (ideally clear) privacy policy. Is this the case with OpenAI? No. Will it be the case with Logseq? I fear not.
I just can't accept this hypocritical attitude of shedding responsibility by fooling people into agreements: the basis of individual sovereignty and freedom is **informed** consent.
Also, the major issue here is Logseq cheating on its users. I sponsored a project whose tagline started with "A **privacy-first**, **open-source** platform..." and I discovered later that it is not fully FOSS (and the closed-source module is even the most important one: the one responsible for e2e encryption!) and that it is using that money to integrate the biggest threat to privacy ever!
@post A bit surprising. Don't developers involve forum moderators in their schedule at all with regard to community participation and desired features?
Yes, but they are totally ignoring the feedback; here are the details:
@post I had already read that post, Alex. But my question was more specific: neglecting random dudes on the internet is a different thing from ignoring moderation staff actively trying to communicate desired features, in case developers don't have time to look at the voting system. They really should work on their communication if neither gets attention.
I'm not sure if you are replying to me, but just in case: when I said "they ignore feedback" I was of course not referring to posts here but to the Feature Requests category that they set up with votes.
@post I see what you mean #logseq