
Nailed it.
I find it's just a more user-friendly, narrative-style search result.

Read the linked sources. Google the topic (if medical) with NIH added and go read those reports. If they're impenetrable, read the "Discussion" and "Results" or "Conclusion" sections. If there are concepts or acronyms you don't understand, research those. If you keep your focus on a particular area, you can become reasonably proficient in interpreting the studies related to that area.

At the end of the day, the results will be determined by base intelligence and, critically, the level of intellectual curiosity someone has.
 
Well, if you google something and read up on it, do you know it or not? lol
I was doing some analysis and got stuck on a code prompt. I asked Chat and it gave me three answers. All three worked. I'm defending my work to the chair-bosses, and I mention the other two paths and why I chose the path I chose. They tell me I'm smart and extend my team's contract.
Should I feel dishonest? It is a tool. It's like shitting on someone for using a chainsaw because other people like to use axes.
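To make the "three answers, all three worked" point concrete, here's a hypothetical Python sketch (my own invented task, not the poster's actual analysis): three interchangeable ways to compute a per-group average, any one of which you could defend to the chair-bosses.

```python
from collections import defaultdict
from itertools import groupby
from statistics import mean

# Hypothetical task: average a value per group, three different ways.
data = [("a", 3), ("a", 5), ("b", 4)]  # (group, value) pairs

# Path 1: explicit loop with an accumulator
def group_mean_loop(rows):
    buckets = defaultdict(list)
    for group, value in rows:
        buckets[group].append(value)
    return {group: mean(values) for group, values in buckets.items()}

# Path 2: comprehension over the distinct groups (simple, but rescans the data per group)
def group_mean_comprehension(rows):
    return {g: mean(v for grp, v in rows if grp == g) for g in {grp for grp, _ in rows}}

# Path 3: itertools.groupby on sorted input (closest to a SQL GROUP BY)
def group_mean_groupby(rows):
    ordered = sorted(rows, key=lambda r: r[0])
    return {g: mean(v for _, v in items) for g, items in groupby(ordered, key=lambda r: r[0])}

# All three agree; "why I chose this path" comes down to readability and how the data scales.
assert group_mean_loop(data) == group_mean_comprehension(data) == group_mean_groupby(data)
```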

For those pretending to be smart solely because of GPT, they won't last beyond four comments, as they'll keep getting caught out. They also won't spot where the AI is confabulating or hallucinating.
I was just fucking around on my first comments and I forgot that this was a debated topic.

There is value in working things out on your own. I am sure you would agree with this. Used in the manner you describe, it seems harmless, but what happens when the next generation has never had to work through a truly hard problem because they have only relied on AI to give them the answers?

I’ll add to your side of the argument. I rely on computer simulations for work. I can get the program to give me a solution; knowing whether that solution is correct is the really hard part. A more complex problem just means more assumptions, which translates into more uncertainty in the solution. Garbage in = garbage out. It is just a tool.
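A minimal sketch of that garbage-in = garbage-out point, in Python rather than any particular simulation package (the toy model and all the numbers are invented for illustration): the looser the assumptions on the inputs, the wider the spread in the "solution".

```python
import random
import statistics

def toy_model(load_kN, area_mm2):
    """Stand-in for a real solver: stress = load / area, in MPa."""
    return load_kN * 1000 / area_mm2

def monte_carlo(relative_uncertainty, n=10_000):
    """Propagate the same relative uncertainty on both inputs through the model."""
    results = []
    for _ in range(n):
        load = random.gauss(50, 50 * relative_uncertainty)    # kN, assumed input
        area = random.gauss(200, 200 * relative_uncertainty)  # mm^2, assumed input
        results.append(toy_model(load, area))
    return statistics.mean(results), statistics.stdev(results)

# More uncertain assumptions in -> more uncertain solution out.
for u in (0.01, 0.05, 0.20):
    mean_stress, spread = monte_carlo(u)
    print(f"input uncertainty {u:.0%}: stress ~ {mean_stress:.1f} +/- {spread:.1f} MPa")
```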

Looking at this through the lens of soft and hard skills might be a better way to tackle it. For soft skills, it might help someone get their ideas across more clearly, but it could also cause more groupthink and let people pass ideas off as their own. For hard skills, it lets you be more productive, but you lose those little lessons you learn when you work on a problem for days or weeks.

FYI, I know fuck all about ChatGPT. lol
 
If there are concepts or acronyms you don't understand, research those. If you keep your focus on a particular area, you can become reasonably proficient in interpreting the studies related to that area.

Maybe? Interpreting the statistics is an area that can be challenging, but as it relates to a particular subject matter, yeah. I have, for example, acquired a great deal of knowledge on the topic of cholesterol metabolism, which should be obvious from my posts here. The funny thing is that my knowledge in that area probably outstrips that of most doctors and some cardiologists, who all have a way more advanced understanding of biology than I do.

The cool thing about LLMs for me is that they allow me to parse and analyze a great deal more information than I would otherwise be able to. In my case, offloading the statistics work helps tremendously. I do have to validate that it's correct, but it's still way faster than me stumbling through it.
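On the "validate that it's correct" part: one straightforward check is to recompute whatever statistic the model hands you with a standard library. A hypothetical sketch with SciPy; the numbers are invented, not from any study.

```python
from scipy import stats

# Invented example: change in LDL-C (mg/dL) for two groups.
group_a = [-32, -28, -41, -25, -30, -36, -29, -33]
group_b = [-12, -18,  -9, -15, -20, -11, -14, -16]

# Suppose the LLM claims the difference is significant; recompute it yourself.
t_stat, p_value = stats.ttest_ind(group_a, group_b, equal_var=False)  # Welch's t-test
mean_diff = sum(group_a) / len(group_a) - sum(group_b) / len(group_b)

print(f"t = {t_stat:.2f}, p = {p_value:.4f}, mean difference = {mean_diff:.1f} mg/dL")
```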

At the end of the day, the results will be determined by base intelligence and, critically, the level of intellectual curiosity someone has.

I think it goes beyond that. There is some measure of creativity or something that I can't quite pinpoint. I know some smart people who use them for certain tasks: "create a plan to do X" or "provide code to do Y". The prompts can be quite sophisticated, but effectively the thing is an agent sent to solve a problem.

Other people, myself included, use them to develop thoughts and ideas as well as to advance their understanding of various things. I don't think that intelligence is the main distinction between these two sets of people.
 
Nail on the head. I've trained my ChatGPT to talk like me and act like me, and we literally talk back and forth like it's my twin, brainstorming ideas. It's helped me in so many areas and is the greatest teacher I've had, for a small fee. I literally just feed it my ideas and it corrects me if I'm going too far off the rails. It's helped me create DIY thermal paste and thermal pads after we both went through a few patents and reverse engineered the process into a MacGyver solution suitable for any layman.


I'm a bit autistic and have ADHD, and ChatGPT keeps my ideas on track and never makes me feel judged. I feel smarter... I mean, I already was decently smart, but now I feel augmented and focused. If I'm stumped, I get ChatGPT to break the concepts down like I'm 10. It never fails to teach me something new.

Now that the censored guardrails are mostly removed, it's a perfect tool for learning homebrew too. Love it.

Are you me?
 
Maybe? Interpreting the statistics is an area that can be challenging, but as it relates to a particular subject matter, yeah. I have, for example, acquired a great deal of knowledge on the topic of cholesterol metabolism, which should be obvious from my posts here. The funny thing is that my knowledge in that area probably outstrips that of most doctors and some cardiologists, who all have a way more advanced understanding of biology than I do.

The cool thing about LLMs for me is that they allow me to parse and analyze a great deal more information than I would otherwise be able to. In my case, offloading the statistics work helps tremendously. I do have to validate that it's correct, but it's still way faster than me stumbling through it.



I think it goes beyond that. There is some measure of creativity or something that I can't quite pinpoint. I know some smart people who use them for certain tasks: "create a plan to do X" or "provide code to do Y". The prompts can be quite sophisticated, but effectively the thing is an agent sent to solve a problem.

Other people, myself included, use them to develop thoughts and ideas as well as to advance their understanding of various things. I don't think that intelligence is the main distinction between these two sets of people.
The number of projects I've actually started and finished with the help of ChatGPT, versus what I managed before because of my bad ADHD, is unreal. It actually does something to my brain and keeps me motivated to finish what I started, which translates to more overall skills.


I was able to learn Python to a decent level just by spamming chat with the basic concepts and having it teach me through basic exercises and projects. Along with a course I bought, lol.
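For anyone curious what that kind of "basic exercise" looks like in practice, here's a classic one (my own example, not from the course or the chat): write it yourself, then have the model explain it, break it, and extend it.

```python
def fizzbuzz(n):
    """Classic beginner exercise: numbers 1..n, with Fizz/Buzz substitutions."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

print(", ".join(fizzbuzz(15)))
```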
 
I was just fucking around on my first comments and I forgot that this was a debated topic.

There is value in working things out on your own. I am sure you would agree with this. Used in the manner you describe, it seems harmless, but what happens when the next generation has never had to work through a truly hard problem because they have only relied on AI to give them the answers?

I’ll add to your side of the argument. I rely on computer simulations for work. I can get the program to give me a solution; knowing whether that solution is correct is the really hard part. A more complex problem just means more assumptions, which translates into more uncertainty in the solution. Garbage in = garbage out. It is just a tool.

Looking at this through the lens of soft and hard skills might be a better way to tackle it. For soft skills, it might help someone get their ideas across more clearly, but it could also cause more groupthink and let people pass ideas off as their own. For hard skills, it lets you be more productive, but you lose those little lessons you learn when you work on a problem for days or weeks.

FYI, I know fuck all about ChatGPT. lol
This debate has gone on since the invention of the scientific calculator. :D
 
I can never be thankful enough to ChatGPT for keeping me employed for almost two years without a single HR incident when I worked in foreign-based chem. There is no way I could have quickly drafted responses to some of the most disrespectful, mind-bogglingly unprofessional emails from collaborators ever.
I just prompt it to draft a 'defusing' response and swap some of the tamer words (usually adjectives) for slightly 'barbed' but archaic or esoteric ones.
I mean, I still needed to convey my displeasure.
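If someone wanted to script that workflow rather than paste into the chat UI, a minimal sketch with the OpenAI Python client might look like the following; the model name, the prompt wording, and the manual "swap in barbed adjectives" step afterwards are all illustrative assumptions, not what the poster actually runs.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

rude_email = "(paste the collaborator's email here)"

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model choice
    messages=[
        {
            "role": "system",
            "content": "You draft calm, professional, de-escalating replies to hostile work emails.",
        },
        {
            "role": "user",
            "content": f"Draft a short, defusing reply to this email:\n\n{rude_email}",
        },
    ],
)

draft = response.choices[0].message.content
print(draft)  # then hand-swap a few tame adjectives for more 'barbed' archaic ones
```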
 
I can never be thankful enough to ChatGPT for keeping me employed for almost two years without a single HR incident. There is no way I could have quickly drafted responses to some of the most disrespectful, mind-bogglingly unprofessional emails from collaborators ever.
I just prompt it to draft a 'defusing' response and swap some of the tamer words (usually adjectives) for slightly 'barbed' but archaic or esoteric ones.
I mean, I still needed to convey my displeasure.
You have sparked my interest; getting thoughts down on paper or into text is one of my biggest weaknesses.
 
Jesus fucking h… this place went to total shit when all the fat fucks from Reddit came over for GLP… Somehow it's gotten even worse. All I'm trying to do is check in every few days to see if there's an update. Meso should just kill this thread at this point.

Now they're ruining Walmart with their "buying less low quality food" bullshit...when will this nightmare end and those who "deserve" to be fat go back to their rightful place? Wake me up when eyeballs start falling out and hearts seize, the way we all know the GLP revolution is bound to end. Stay strong, brother, and commit to never touching that poison.

 
Now they're ruining Walmart with their "buying less low quality food" bullshit...when will this nightmare end and those who "deserve" to be fat go back to their rightful place? Wake me up when eyeballs start falling out and hearts seize, the way we all know the GLP revolution is bound to end. Stay strong, brother, and commit to never touching that poison.

Ghoul, I respect your opinions and the insight you always provide. I would love to hear your reasoning for calling GLP-1s poison but not other peptides/steroids.
 