
Search Social Networks for free username - Moriarty part-1

You can find this gem on GitHub and rubygems. Sharing is the best way to support my development. This is part of a multi-part series about the moriarty gem and its creation, from script to gem. It's intended for beginners.

GitHub: decentralizuj / moriarty

Moriarty - Tool to check social networks for available username

The gem is still in alpha and many things will be added (like scraping info from social networks). Breaking changes are highly possible, so I do not suggest using this gem except for development and testing. That means: do not use it in production services; otherwise, feel free to play with it.


Tool to check social networks for available username.
Idea from python tool - sherlock

What it does

Search multiple social networks for a free username. It's the reverse of Sherlock, so a not-found username is a success. The --hunt argument will run the search like Sherlock would, looking for valid users.

How to install

Clone repo and install dependencies:

# Moriarty uses the rest-client, nokogiri and colorize gems
git clone https://github.com/decentralizuj/moriarty && cd moriarty && bundle install

Or install from rubygems:

DISCLAIMER: The GitHub repository is updated before Rubygems. Until v-1.0.0 is released, the recommended way is to clone the repo,…

After using Sherlock for a long time (I like it!), I decided to create something similar in Ruby. This gives me more power to easily extend the software with new features, like scraping info from a website, comments from FB, tags from photos, etc... For me, Ruby is well above Python (because of developer happiness, and the many ways to do the same thing). When I say "above" Python, I'm talking about personal experience, not Ruby vs Python in general.

Start Project

What script should do:

  • take username
  • take website url
  • submit request
  • receive response

For the HTTP request I'll go with the rest-client gem, and here's why. The RestClient response is used to scrape the page HTML with the Nokogiri::HTML method. If we used Net::HTTP#get instead of RestClient#get, we would get all positive results, because Net::HTTP returns a body even for an error page. RestClient, on the other hand, raises an exception when there's no page to scrape, so I use rescue to set @success to false. This does not work for all sites (some return a 200 page even when the user does not exist), so I will need to check the scraped data in the future.
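The rescue-based success check described above can be sketched in isolation. The fetch helper below is hypothetical (not part of the gem): it returns true when the block succeeds and false when it raises, which is the same pattern used with RestClient.get, since rest-client raises on error responses.

```ruby
# Hypothetical helper illustrating the rescue pattern:
# success is simply "the request did not raise".
def fetch
  yield
  true
rescue StandardError
  false
end

found   = fetch { "<html>profile page</html>" }  # request succeeded
missing = fetch { raise "404 Not Found" }        # request raised

puts found    # => true
puts missing  # => false
```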

#!/usr/bin/env ruby

require 'rest-client'
require 'nokogiri'

class Moriarty

  # initialize new object, set user and url
  # add https:// and '/' to the end
  def initialize( name = '', site = '', type = :https )
    @user = name.to_s
    @url  = type.to_s + '://' + site.to_s
    @url += '/' unless @url.end_with?('/')
  end

  # make request to url
  # accept optional options hash
  def go( opt = {} )
    # use initialized data or enter new
    opt[:user] ||= @user
    opt[:site] ||= @url

    # construct url from sitename and username
    uri = opt[:site].to_s + opt[:user].to_s

    # make request with rest-client
    @response = RestClient.get uri

    # get page html with nokogiri
    @html = Nokogiri::HTML @response

    # if everything is fine, set and return true
    @success = true
  rescue StandardError
    # otherwise, set and return false
    @success = false
  end
end

Part of the job is done. But we must somehow access these variables, so we need to define attr_reader (we will create setter methods manually).

class Moriarty
  attr_reader :url, :user, :response, :html

  # attr_reader can't define success?, so we write it by hand
  def success?
    @success
  end
  # ... ...
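A side note on the success? reader: attr_reader cannot generate predicate methods, because attr_reader :success? would need an instance variable literally named @success?, which is not a valid name in Ruby (it raises a NameError). A minimal standalone sketch, with a hypothetical Report class, of writing the predicate by hand:

```ruby
class Report
  def initialize(ok)
    @success = ok
  end

  # attr_reader :success? would raise a NameError,
  # so the predicate is defined manually
  def success?
    @success
  end
end

puts Report.new(true).success?   # => true
puts Report.new(false).success?  # => false
```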

Now we can use Moriarty class like this:

# This way we can check GitHub, dev, instagram, but we can't
# check linkedin etc... This is because those pages return
# data to nokogiri to scrap, even if no user

@jim = 'moriarty', 'github.com' )

if @jim.success?
  puts "#{@jim.user} is registered on #{@jim.url}"
else
  puts "Username #{@jim.user} seems to be free on #{@jim.url}"
end

This post is part of a multi-part series. In the next part I will add all the other methods from the gem, and in the third I will create the Ruby gem. Last, but not least, I will explain the CLI interface.

With the next updates I will add methods to scrape the HTML and check whether a user is really registered or a false positive. If registered, the data will be saved in some kind of database (I like CSV for this kind of data, or just a .txt).
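Saving results to CSV could look something like this, using Ruby's standard csv library. This is only a sketch: the found_users array and the results.csv filename are assumptions for illustration, not part of the gem yet.

```ruby
require 'csv'

# hypothetical results: [username, site] pairs
found_users = [
  ['moriarty', 'https://github.com/'],
  ['jim',      'https://dev.to/']
]

CSV.open('results.csv', 'w') do |csv|
  csv << %w[username site]              # header row
  found_users.each { |row| csv << row } # one row per hit
end
```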

The next part will come soon, in the next few hours...
