You got your hands on some data that was leaked from a social network and you want to help the poor people.
Luckily you know a government service ...
PowerShell to the rescue!
$json = Invoke-WebRequest 'gist.githubusercontent.com/jorinvo...' | ConvertFrom-Json
$json | Select-Object name,creditcard | Export-Csv "$(Get-Date -Format yyyyMMdd).csv" -NoTypeInformation
Excellent, man
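For readers without PowerShell, roughly the same steps sketched in Python. This is only a sketch: the gist URL above is truncated, so a local data.json copy is assumed, and the field names follow the PowerShell example.

import csv
import json
from datetime import date

# Local copy of the leaked JSON is assumed (the gist URL in the thread is truncated).
with open('data.json') as src:
    people = json.load(src)

# Keep only name and creditcard, and name the file after today's date, e.g. 20150425.csv.
outfile = date.today().strftime('%Y%m%d') + '.csv'
with open(outfile, 'w', newline='') as dst:
    writer = csv.writer(dst)
    writer.writerow(['name', 'creditcard'])
    for person in people:
        writer.writerow([person.get('name'), person.get('creditcard')])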
ramda-cli:
scala:
A one-liner, if you're a Linux user 😉
However, there is something you have not mentioned in your post: should the CSV file have a header line?
If yes, then use this:
This adds quotes.
Maybe add this sed command:
Doesn't the second solution need a >> in the last line, so the output is appended?
Yes, it does. (I didn't copy the correct version.)
Thanks ☺
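Coming back to the header-line question above: a minimal Python sketch (file and field names are assumptions) that writes the header row explicitly and leaves the quoting to the csv module, so no extra sed pass is needed.

import csv
import json

# Assumed local input; field names follow the examples in the thread.
with open('data.json') as src, open('output.csv', 'w', newline='') as dst:
    people = json.load(src)
    writer = csv.DictWriter(dst, fieldnames=['name', 'creditcard'], extrasaction='ignore')
    writer.writeheader()      # the optional header line
    writer.writerows(people)  # values are quoted automatically where needed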
Aaaand Rust :)
Really overkill for this task, but fun nevertheless!
PHP:
You beat me to the PHP implementation. And your solution is so elegant.
Since the input JSON could be really large, here is a Node.js streaming version (using the stream-json package):
Nice! There is also csv-write-stream, so you can save some code :)
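As an aside, the same streaming idea sketched in Python rather than Node.js. This is not the commenter's code; it assumes the third-party ijson package and a local data.json.

import csv
import ijson  # third-party incremental JSON parser (assumption: pip install ijson)

with open('data.json', 'rb') as src, open('output.csv', 'w', newline='') as dst:
    writer = csv.writer(dst)
    writer.writerow(['name', 'creditcard'])
    # 'item' yields each element of the top-level array without loading the whole file.
    for person in ijson.items(src, 'item'):
        writer.writerow([person.get('name'), person.get('creditcard')])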
Using the CSV module to avoid any quoting pitfalls. :)
Ruby is still one of the prettiest languages!
Maybe you can use open(url).read from require 'open-uri' instead of curl, to allow it to run on other systems 🙂
Alternatively, it could look like this:
Oh, I like that!
open-uri is built-in. Also awesome.
One-liner:
A few things to note:
cache is a program I wrote that caches command-line invocations; it's there to make it cheap to iterate (e.g. so you don't have to hit the network each time): github.com/JoshCheek/dotfiles/blob...
My shell is fish (fishshell.com), which allows multi-line editing, and the parentheses in fish are like backticks in bash, so the > (...) is redirecting the output into a file whose name is the result of the...
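As an aside for non-Ruby readers, the open-uri suggestion above (fetching the file in-language instead of shelling out to curl) translates to Python roughly like this; the URL is just a placeholder, since the real gist address is truncated.

import json
from urllib.request import urlopen

url = 'https://example.com/data.json'  # placeholder URL
people = json.loads(urlopen(url).read().decode('utf-8'))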
Nice post!
All users share the same date, so I didn't bother writing into separate files.
Another thing: I was going to write "Hey, that's not valid JSON you are giving us", because I saw the objects are in a list and that list is not wrapped in an outer object. But my Python parser did not complain, so it turns out to be valid. You learn something new every day.
Having an array at the top level of a JSON document is indeed valid, although it is definitely an anti-pattern: by doing so you block yourself from adding any meta information in the future.
If you build an API, you always want to wrap an array in an object. Then you can add additional fields like possible errors or pagination later on.
e.g.
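A hypothetical wrapper along those lines (all field names and values here are placeholder assumptions):

# A wrapped response leaves room for metadata next to the actual records.
response = {
    'data': [
        {'name': 'Jane Doe', 'creditcard': '1234-2121-1221-1211'},
    ],
    'errors': [],
    'pagination': {'page': 1, 'per_page': 50, 'total': 1},
}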
Personally, I'd prefer the array in most cases. If I call an endpoint called customers, I would expect it to return an array of customers, not something that contains such an array, might or might not have an error, and so on.
If I want to stream the response, I'd also be better off with an array, because whatever streaming library I use probably supports it.
It seems JSON can have an array at the root, even according to the first standard: tools.ietf.org/html/rfc4627, section 2.
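A quick Python check of that point: a bare top-level array parses without any wrapper object.

import json

records = json.loads('[{"name": "Jane Doe"}, {"name": "John Roe"}]')
print(len(records))  # 2 -- the parser is happy with an array at the root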
R
I set myself a time limit of 15 minutes, with no Google. I did not know how to download using Python, so I used wget or PowerShell. The rest is straightforward.
A vanilla Node.js version:
Well, at work I would use a tool called "IBM Transformation Extender", which is specialised in data transformation. It breaks the job down into 3 tasks:
...and in f_record() one would simply drag'n'drop the name and the credit card fields from the input to the output.
Not the cheapest solution, obviously, but its maintainability is great if you have hundreds of these mappings.
Since I started learning Ruby this week, here is my solution written in it :D
Thanks for another great challenge Jorin :)
Almost all submissions (all except 2 at this time) write the CSV by hand, not using a library. The output will not be valid if a value contains , or "
True. I have not thought of that.
I open the CSV in LibreOffice to make sure it comes out fine, but with really big files that might not be possible.
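To illustrate the quoting point, here is a small Python comparison with made-up values: joining fields by hand breaks on an embedded comma, while the csv module escapes it.

import csv
import sys

row = {'name': 'Doe, Jane', 'creditcard': '1234-2121-1221-1211'}

# Hand-rolled output: the embedded comma silently produces three columns.
print(','.join(row.values()))

# The csv module quotes the value, keeping two columns.
writer = csv.DictWriter(sys.stdout, fieldnames=['name', 'creditcard'])
writer.writerow(row)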
Perl 6? :
Of course, in reality you'd probably want to use Text::CSV to format the CSV output and handle quoting and escaping properly.
Or, if you guys like nasty one-liners (require statements don't count):
I'm trying to do it in Elixir now :D
awk:
Just leaving a note here for everyone who would like to see more tools and solutions: check out the original CSV Challenge.