JoelBonetR 🥇


AI generated 90% of this post

The heading image was generated with DALL·E 2.

In the previous post I explained these concepts in a way that I felt was understandable enough for a wide range of people, though I could be wrong since it's just my point of view in the end, so I decided to ask ChatGPT to explain what it does.

The conversation went like this:

Me:
can you explain me the following function

export const pipeAsync: any =
  (...fns: Promise<Function>[]) =>
  (input: any) =>
    fns.reduce((chain: Promise<Function>, func: Function | Promise<Function> | any) => chain.then(func), Promise.resolve(input));

ChatGPT:

This function is creating a utility function called "pipeAsync", which allows chaining multiple asynchronous functions together in a pipeline.

It takes a list of functions as input, passed as rest parameter, which are all wrapped inside a promise.

It returns a new function that takes an input value as an argument. This returned function uses the Array.prototype.reduce method to iterate over the array of functions, and calls each function in sequence, passing the result of the previous function call as input to the next function.

The reduce function starts with an initial value of Promise.resolve(input), which is the input passed to the returned function. This is the starting point of the pipeline.

The reduce function then calls the next function in the pipeline with the result of the previous function call, and this process continues for each function in the pipeline.

The result of the reduce function is a new promise that resolves with the final result of the pipeline.

The "pipeAsync" function allows you to chain together multiple asynchronous functions in a way that makes the code more readable and maintainable. It is similar to the "pipe" function from functional programming, but it's adapted to work with async/await.


While the explanation is 100% accurate, in my opinion the language and vocabulary it uses may require a bit of focus to understand.
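For instance, here's a minimal usage sketch of pipeAsync; the double and addTen helpers below are made-up examples for illustration, not something from the conversation above:

// Hypothetical helpers, used only to illustrate the pipeline (not part of the conversation above)
const double = async (n: number) => n * 2;  // async step
const addTen = (n: number) => n + 10;       // plain sync steps work too, .then() accepts them

const compute = pipeAsync(double, addTen);

// Equivalent to Promise.resolve(5).then(double).then(addTen), so it resolves to 20
compute(5).then((result: number) => console.log(result));

Each step receives the resolved value of the previous one, which is exactly the chain.then(func) behaviour ChatGPT describes.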

What do you think? Is it better explained by GPT or by flesh-and-blood people? Let me know in the comments!

Latest comments (16)

ventolin

Some people are gonna have to start writing better posts if they don't want to be replaced by AI.

JoelBonetR 🥇

I already tried it in detail and we still need to wait a bit longer before that situation arrives, so we have a bit more time to train ourselves! 😂

ventolin

😂

Thomas Hansen

Hi Joel, if you build your own model it'll typically do a much better job :)

JoelBonetR 🥇

Thanks @polterguy, I'll take a look at it 😀

At first glance I doubt I can "easily" create a training model good enough to compete with ChatGPT though 😂😂

Thomas Hansen

Of course not, but ask ChatGPT "who is Joel Bonet", then realise every single company out there might want a chatbot answering similar questions, such as "What's the price of your products?" or "Why should I choose your company?", etc, etc, etc.

If you ask the publicly available ChatGPT these questions, it will (at best) try to sell you a pro OpenAI account.

Helitha Rupasinghe

AI generated this comment!

Bernard Bado

AI understands code better than developers.
That's why I've created WhisprAI, a code review assistant.

Aleksandr Korolev

I think that's true with regard to the technical side, but AI still has difficulties when it comes to the business side.

Bernard Bado

For now

anes

It's actually crazy how hard it is for us humans to tell when an article was written by AI and when it wasn't. I did the same thing but to a bigger extent, which is documented here if you're interested.

Ben Calder

So far I'm fairly confident I can tell when AI has been used. It's pretty sad if people's reading ability and discernment are so poor that they always fall for it; though too many people fall for the lies spouted by politicians, so go figure...
From what I've seen so far AI generated text is usually highly descriptive; verbose and has no personality. It avoids expressing an opinion, but instead presents all possible sides of an argument. It's soulless and boring. Just what you would expect...

JoelBonetR 🥇
AI generated text is usually highly descriptive; verbose and has no personality. It avoids expressing an opinion, but instead presents all possible sides of an argument.

So in the end it outputs what any adult discussion should look like (at least in the science-related world, though that won't be true for yellow journalism, for example) 😅

anes

@blindfish3 @jonrandy Sometimes it is really obvious, don't get me wrong. But as someone who picked up English as his 3rd language, I would be lying if I told you that I can always pick up on it.

Jon Randy 🎖️

Mostly it isn't that hard; they always have a certain 'feel' to them... Whenever I'm suspicious I run it through a GPT detector, and my hunch is usually correct.

JoelBonetR 🥇

Hi @aneshodza, thank you for sharing! I'll take a gander for sure 😁