Python can be used to automate tasks. One task that is easy to automate is web scraping and clicking buttons on web pages.
This post walks through a simple Python program. First we will find a proxy server (since we don't want the web server to notice us clicking a thousand times a second), and then we will use BeautifulSoup to scrape the page, find the button, and click it.
1) Find the proxy
To find the proxy, we will do:
import os
import requests
from bs4 import BeautifulSoup

url = "https://free-proxy-list.net"
response = requests.get(url)
soup = BeautifulSoup(response.text, "html.parser")

# Grab every row of the first table on the page
table = soup.find('table')
rows = table.findAll('tr')
for row in rows:
    cells = row.findAll('td')
    if len(cells) < 2:
        continue  # header rows use <th>, so they have no <td> cells
    proxy_ip = cells[0].text
    proxy_port = cells[1].text
    proxy = 'https://' + proxy_ip + ':' + proxy_port
    os.environ['https_proxy'] = proxy
The code above scrapes the proxy list, extracts the IP address and port of each proxy server, and points the https_proxy environment variable at it (after the loop finishes, the last proxy in the list is the one left in the environment).
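The row-parsing step can also be done with just the standard library, which is handy for testing the logic offline. Below is a minimal sketch; the sample HTML and the helper name extract_proxies are illustrative, not the site's real markup or API:

```python
from html.parser import HTMLParser

class ProxyTableParser(HTMLParser):
    """Collects the text of <td> cells; rows are assumed to be (IP, port) pairs."""
    def __init__(self):
        super().__init__()
        self.in_td = False
        self.cells = []

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self.in_td = True

    def handle_endtag(self, tag):
        if tag == "td":
            self.in_td = False

    def handle_data(self, data):
        if self.in_td:
            self.cells.append(data.strip())

def extract_proxies(html):
    """Return 'https://ip:port' strings from a table of <td> pairs."""
    parser = ProxyTableParser()
    parser.feed(html)
    pairs = zip(parser.cells[0::2], parser.cells[1::2])
    return ["https://" + ip + ":" + port for ip, port in pairs]

# Tiny sample table, for illustration only
sample = "<table><tr><td>1.2.3.4</td><td>8080</td></tr></table>"
print(extract_proxies(sample))  # ['https://1.2.3.4:8080']
```

Because header rows on the real page use <th> cells, this parser skips them automatically.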
2) Visit the website
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Safari()
driver.get("https://youtu.be/blahblahblah")
elem = driver.find_element(By.CLASS_NAME, "ytp-large-play-button-bg")
print(elem.get_attribute("fill"))
elem.click()  # push the play button
driver.close()
Here we use a Selenium WebDriver to open a YouTube video and click its play button. By rotating through the proxies found in step 1 between runs, each visit can come from a different IP address. Note that the browser itself must be configured to use the proxy: setting the https_proxy environment variable affects requests, but not necessarily the WebDriver-controlled browser.
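Rotating proxies between visits can be sketched as follows. This is a minimal illustration; the proxy strings are placeholders, and in practice you would use the list scraped in step 1:

```python
import os
from itertools import cycle

# Placeholder proxies, for illustration only
proxies = ["https://1.2.3.4:8080", "https://5.6.7.8:3128"]
rotation = cycle(proxies)

def use_next_proxy():
    """Point https_proxy at the next proxy in the rotation and return it."""
    proxy = next(rotation)
    os.environ["https_proxy"] = proxy
    return proxy

# Call once before each browser session to switch IP addresses
print(use_next_proxy())  # https://1.2.3.4:8080
print(use_next_proxy())  # https://5.6.7.8:3128
```

itertools.cycle wraps around automatically, so the script can keep running even with a short proxy list.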