How to Get DA 94 Google Backlinks

increase your da for free hack

So what I am going to show you today is a black-hat technique used to gain hundreds of backlinks from Google domains, completely free of charge. It will only take you about 5 minutes and does not require any software. Let’s get started.

  • Copy the URLs below to your clipboard and paste them here:
  • In the “Find this:” field, enter, and in the “Replace with:” field, enter the URL you would like a backlink for. Remember, you must include the https://. Then click the “Find and Replace Text” button.
  • Go to and paste your results from the previous step, then click the “rapid index” button. Keep this page open until the indexing has finished.
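The find-and-replace step is just a bulk text substitution, so you can also do it locally instead of using an online tool. Here is a minimal Python sketch; the `YOURDOMAIN` placeholder token and the example Google URLs are my own illustrations, not part of the original URL list:

```python
# Bulk find-and-replace over a list of backlink URLs.
# Assumption: the source list uses a placeholder token ("YOURDOMAIN" here)
# where your own URL should go. The example URLs below are illustrative only.

def replace_placeholder(urls, placeholder, my_url):
    """Return the URL list with every occurrence of placeholder swapped for my_url."""
    return [u.replace(placeholder, my_url) for u in urls]

if __name__ == "__main__":
    source_urls = [
        "https://translate.google.com/translate?u=YOURDOMAIN",
        "https://www.google.com/url?q=YOURDOMAIN",
    ]
    for line in replace_placeholder(source_urls, "YOURDOMAIN", "https://example.com"):
        print(line)
```

Paste your full URL list into `source_urls` (or read it from a file) and the output is ready for the indexing step.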

That’s it! How simple was that to get 299 Google domain backlinks? These will all eventually get indexed and increase your website’s DA/PR rank.

This hack will only allow you to build 1 unique URL to each domain; however, you can submit as many different URLs as you like.

Just a quick video to show you the moves.

Free Instagram Image and Video Scraper

Instagram Scraper

Instagram is a great source of free images and clips for your SEO projects. This Instagram scraper was developed by arc298 on GitHub and is available free of charge with unlimited use. It is a Python script that is very easy to configure and use.

Whenever I use Python scripts, I always use Ubuntu in a virtual machine. The main reason is that I find it a lot easier to run Python scripts in Ubuntu, and I can keep all of my scraping scripts, with all of their instructions, in one place. Furthermore, it can run tasks in the background without bothering me while I’m doing other things on my computer.

So, assuming you are running a fresh installation of any variation of Ubuntu, these are the instructions for installation and use.


To install instagram-scraper:

$ pip install instagram-scraper

To update instagram-scraper:

$ pip install instagram-scraper --upgrade

Alternatively, you can clone the project and install it from source. Make sure you cd into the instagram-scraper-master folder before running the command below.

$ python setup.py install

How to Use

To scrape a user’s media – all images and videos:

$ instagram-scraper <username> -u <your username> -p <your password>  

(enter your details without the < > characters)

NOTE: To scrape a private user’s media you must be an approved follower.

By default, downloaded media will be placed in <current working directory>/<username>.

Providing a username and password is optional; if not supplied, the scraper runs as a guest. Note: in this case, all private users’ media will be unavailable, as will users’ stories and high-resolution profile pictures.

To scrape a hashtag for media:

$ instagram-scraper <hashtag without #> --tag 

It may be useful to specify the --maximum <#> argument to limit the total number of items to scrape when scraping by hashtag.

You can also supply a file containing a list of usernames:

$ instagram-scraper -f ig_users.txt       
# ig_users.txt


# and so on...

The usernames may be separated by newlines, commas, semicolons, or whitespace.
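Since the scraper accepts all of those separators interchangeably, parsing such a file is a one-liner with a regular expression. A quick sketch of how it can be done (the helper name is mine, not part of instagram-scraper):

```python
import re

def parse_usernames(text):
    """Split a username list on newlines, commas, semicolons, or whitespace,
    matching the separators instagram-scraper accepts in -f files."""
    return [u for u in re.split(r"[,;\s]+", text) if u]

# Example: mixed separators all yield the same flat list.
sample = "natgeo, nasa;instagram\nbbcnews  cnn"
print(parse_usernames(sample))  # → ['natgeo', 'nasa', 'instagram', 'bbcnews', 'cnn']
```

This is handy if you want to pre-check or deduplicate a large user list before handing it to the scraper.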

You can also supply a file containing a list of location ids:

$ instagram-scraper --tag <your_tag_here> --include-location --filter_location_file my_locations.txt  
# my_locations.txt


# and so on...

The resulting directory structure will be:

├── some_region1
│   └── images_here
└── some_region2
    └── images_here

The locations can only be separated by newlines and spaces.
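To see what that layout looks like in practice, here is a short Python sketch that parses a location list (split on newlines and spaces, as above) and creates one subdirectory per location; the function name and sample location names are my own illustrations:

```python
import os
import tempfile

def make_location_dirs(base, locations):
    """Create one subdirectory per location under base, mirroring the
    per-location layout the scraper produces, and return the paths."""
    paths = []
    for loc in locations:
        path = os.path.join(base, loc)
        os.makedirs(path, exist_ok=True)
        paths.append(path)
    return paths

if __name__ == "__main__":
    base = tempfile.mkdtemp()
    # Locations may be separated by newlines or spaces:
    locations = "some_region1 some_region2".split()
    for p in make_location_dirs(base, locations):
        print(p)
```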


--help -h               Show help message and exit.

--login-user  -u        Instagram login user.

--login-pass  -p        Instagram login password.

--followings-input      Use profiles followed by login-user as input

--followings-output     Output profiles from --followings-input to file

--filename    -f        Path to a file containing a list of users to scrape.

--destination -d        Specify the download destination. By default, media will 
                        be downloaded to <current working directory>/<username>.

--retain-username -n    Creates a username subdirectory when the destination flag is set.

--media-types -t        Specify media types to scrape. Enter as space separated values. 
                        Valid values are image, video, story (story-image & story-video), broadcast
                        or none. Stories require a --login-user and --login-pass to be defined.
--latest                Scrape only new media since the last scrape. Uses the last modified
                        time of the latest media item in the destination directory to compare.

--latest-stamps         Specify a file to save the timestamps of latest media scraped by user.
                        This works similarly to `--latest` except the file specified by
                        `--latest-stamps` will store the last modified time instead of using 
                        timestamps of media items in the destination directory. 
                        This allows the destination directories to be emptied whilst 
                        still maintaining history.

--cookiejar             File in which to store cookies so that they can be reused between runs.

--quiet       -q        Be quiet while scraping.

--maximum     -m        Maximum number of items to scrape.

--media-metadata        Saves the media metadata associated with the user's posts to 
                        <destination>/<username>.json. Can be combined with --media-types none
                        to only fetch the metadata without downloading the media.

--include-location      Includes location metadata when saving media metadata. 
                        Implicitly includes --media-metadata.

--profile-metadata      Saves the user profile metadata to  <destination>/<username>.json.

--proxies               Enable use of proxies; add a valid JSON object with http and/or https URLs.
                        Example: '{"http": "http://<ip>:<port>", "https": "https://<ip>:<port>" }'

--comments              Saves the comment metadata associated with the posts to 
                        <destination>/<username>.json. Implicitly includes --media-metadata.

--interactive -i        Enables interactive login challenge solving. Has 2 modes: SMS and Email.

--retry-forever         Retry download attempts endlessly when errors are received.

--tag                   Scrapes the specified hashtag for media.

--filter                Scrapes the specified hashtag within a user's media.

--filter_location       Filter scrape queries by command line location(s) ids

--filter_location_file  Provide location ids by file to filter queries 

--location              Scrapes the specified instagram location-id for media.

--search-location       Search for a location by name. Useful for determining the location-id of 
                        a specific place.
--template -T           Customize and format each file's name.
                        Default: {urlname}
                        {username}: Scraped user
                        {shortcode}: Post shortcode (profile_pic and story are empty)
                        {urlname}: Original file name from url.
                        {mediatype}: The type of media being downloaded.
                        {datetime}: Date and time of upload. (Format: 20180101 01h01m01s)
                        {date}: Date of upload. (Format: 20180101)
                        {year}: Year of upload. (Format: 2018)
                        {month}: Month of upload. (Format: 01-12)
                        {day}: Day of upload. (Format: 01-31)
                        {h}: Hour of upload. (Format: 00-23h)
                        {m}: Minute of upload. (Format: 00-59m)
                        {s}: Second of upload. (Format: 00-59s)

                        If the template is invalid, it will revert to the default.
                        Does not work with --tag and --location.

If you are not familiar with Python and/or Ubuntu, just be patient. It is really not that hard; just give it a go!


What Is The Best SEO Software?

Best SEO Software

By far the best all-round SEO software is WordPress installed locally on your computer. Now I know what you’re probably thinking: “WordPress, isn’t that for websites?”. Well, yes, but over the years it has evolved to become more of an online content manager with a multitude of plugins.

So now we need to take a deep dive into this to explain further why WordPress is the best free SEO software.

So, as I had mentioned, WordPress has a lot of plugins, some of which can be used to scrape content from the internet and publish this to your WordPress website.

One of the best, if not the best, plugins for scraping content from the internet is “WordPress Automatic Plugin”. This plugin is very good at scraping websites, grabbing the articles and pictures, and saving them to WordPress. Watch this video from the makers of this plugin to get a better idea of what I’m on about.

There are no doubt more plugins, but nothing comes close. It’s all I use for automatic content creation, and for $30 you simply cannot get anything better.

You can run this plugin on a web-hosted version of WordPress, but you will run into issues when you start creating a lot of content campaigns. The WordPress Automatic Plugin uses a lot of CPU and RAM, resources that most hosting providers can only offer if you go for a dedicated server, which will cost anywhere from $50/month.

This is one of the main reasons why you will want to host WordPress locally on your computer. Even an old computer will most likely have enough CPU and RAM for your SEO projects, and the average household internet connection will be more than enough to power very large campaigns.



Now that you have an automated way to generate content, you will need to have something in place to publish the content to a network of websites.

Again, for this there are many different options, some better than others, but my first choice is SNAP from NextScripts. They have a free version that in and of itself is really good. However, when you start getting serious with content syndication, it is recommended to buy the pro plugin.

SNAP can currently post to the following networks:


Social Networks
Facebook – Autopost text, image or share a link to your profile, business page, community page, or Facebook group
Google+ – Post text, image or share a link to your profile, collection, business page or community
Instagram – Upload your blogpost’s image to your Instagram account.
LinkedIn – Post text, article, image or share a link to your profile, group, or company page
Plurk – Autopost to your account. Ability to attach Image to messages
Pinterest – Post your blogpost’s image to your Pinterest board.
Twitter – Autopost to your account. Ability to attach images to tweets
VK.Com – Post text, image or share a link to your profile or group page
Weibo – Biggest Chinese Microblogging Service. You can post your messages and images
XING – Post text messages, images or share links.
Blogs/Publishing Platforms
Blogger – Autopost to your blog. HTML is supported
Flipboard – Autopost to your magazines
Google My Business – Create Google My Business post
Instapaper – Autopost to your account
LiveJournal – Auto-submit your blogpost to LiveJournal blog or community. LiveJournal engine based website is also supported
Medium – Autopost to your profile or publications
Scoop.It – Autopost to your “Topics”. Ability to attach your blogpost to scoop. Ability to make “Image” posts
SETT – Auto-post to your blog
Tumblr – Create a text post, image post, audio or video post on your Tumblr blog. HTML is supported.
WP Based Blog – Auto-submit your blogpost to another WordPress based site. Support for any standalone WordPress installation, etc.
Link Sharing/Bookmarks
Diigo – Auto-submit bookmark to your account
Reddit – Autopost to your subreddits
Email Marketing
MailChimp – One of the most popular email marketing tools. You can send your blogposts as email campaigns to specific subscribers
Line – Autopost texts, images, or links to your channel, group or chat
Telegram – Autopost texts, images, or links to your channel, group or chat
Yo – Send notifications to your subscribers
Image Sharing
deviantART – Autopost to your blog. HTML is supported
Flickr – Autopost images to your photostream and/or sets. Tags are supported
vBulletin – Auto-submit your blogpost to vBulletin forums. Could create new threads or new posts
YouTube – Post messages to your YouTube channel feed. Ability to attach existing YouTube videos to posts


Content creation and syndication are the 2 core categories of WordPress plugins you must have to turn your WordPress installation into an SEO monster. I will add more useful plugins to add more SEO features for you, but for now this will get you well on your way.

To install a locally hosted version of WordPress on your computer, just check out my tutorial on this here.

It’s very easy and can be done on any type of operating system.