
TypeError: An asyncio.Future, a coroutine or an awaitable is required

I'm trying to make an asynchronous web scraper using BeautifulSoup and aiohttp. This is my initial code to start things. I'm getting a TypeError: An asyncio.Future, a coroutine or an awaitable is required and having a hard time figuring out what is wrong with my code. I am new to Python and would appreciate any help regarding this.

import bs4
import asyncio
import aiohttp

async def parse(page):
    soup=bs4.BeautifulSoup(page,'html.parser')
    soup.prettify()
    print(soup.title)

async def request():
    async with aiohttp.ClientSession() as session:
        async with session.get("https://google.com") as resp:
            await parse(resp)

loop=asyncio.get_event_loop()
loop.run_until_complete(request)

Traceback:

Traceback (most recent call last):
  File "C:\Users\User\Desktop\Bot\aio-req\parser.py", line 21, in <module>
    loop.run_until_complete(request)
  File "C:\Users\User\AppData\Local\Programs\Python\Python38-32\lib\asyncio\base_events.py", line 591, in run_until_complete
    future = tasks.ensure_future(future, loop=self)
  File "C:\Users\User\AppData\Local\Programs\Python\Python38-32\lib\asyncio\tasks.py", line 673, in ensure_future
    raise TypeError('An asyncio.Future, a coroutine or an awaitable is '
TypeError: An asyncio.Future, a coroutine or an awaitable is required

One issue is that loop.run_until_complete(request) should be loop.run_until_complete(request()) - you actually have to call request for it to return a coroutine; passing the function object itself is what raises the TypeError. There are further problems: you are passing an aiohttp.ClientResponse object to parse and treating it as text/html. I got it to work with the following, but I don't know if it fits your needs because parse is no longer a coroutine:

def parse(page):
    soup=bs4.BeautifulSoup(page,'html.parser')
    soup.prettify()
    return soup.title

async def fetch(session, url):
    async with session.get(url) as response:
        return await response.text()

async def request():
    async with aiohttp.ClientSession() as session:
        html = await fetch(session, "https://google.com")
        print(parse(html))

if __name__ == '__main__':
    loop=asyncio.get_event_loop()
    loop.run_until_complete(request())

This also works:

def parse(page):
    soup=bs4.BeautifulSoup(page,'html.parser')
    soup.prettify()
    print(soup.title)

async def request():
    async with aiohttp.ClientSession() as session:
        async with session.get("https://google.com") as resp:
            parse(await resp.text())

And finally, keeping your original structure: pass the awaitable response object to parse, then await page.text() inside it:

async def parse(page):
    soup=bs4.BeautifulSoup(await page.text(),'html.parser')
    soup.prettify()
    print(soup.title)

async def request():
    async with aiohttp.ClientSession() as session:
        async with session.get("https://google.com") as resp:
            await parse(resp)
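The core mistake - passing the function object instead of the coroutine that calling it produces - can be reproduced with the standard library alone, no aiohttp or network access needed. Below is a minimal sketch; the request coroutine here is an illustrative stand-in for the scraper logic, not part of the original code:

import asyncio

# Stand-in for the scraper's entry coroutine (illustrative name).
async def request():
    await asyncio.sleep(0)  # pretend to do async work
    return "ok"

loop = asyncio.new_event_loop()

# Passing the function object itself reproduces the error from the question:
try:
    loop.run_until_complete(request)   # missing () -> not a coroutine
except TypeError as exc:
    print(exc)  # An asyncio.Future, a coroutine or an awaitable is required

# Calling the function first returns a coroutine, which the loop can run:
result = loop.run_until_complete(request())
loop.close()

On Python 3.7+ the usual entry point is asyncio.run(request()), which creates and closes the loop for you, and makes the same distinction: it expects the coroutine object, not the function.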
