Lab 7: Coroutines and Asynchronous Code

This lab explores coroutines and asynchronous programming, focusing on their implementation in both Python and Rust.

Understanding Coroutines and Asynchronous Programming

Key Concepts

  1. Coroutines are functions that can pause their execution and yield control back to the runtime, then resume execution later from where they left off.

  2. Async functions are implemented as coroutines that can be paused during I/O operations, allowing other code to run while waiting.

  3. Futures/Promises represent values that may not yet be available but will be resolved at some point.

  4. Event loops manage the execution of asynchronous code, deciding which coroutine to run next.

Asynchronous Programming in Python

Python’s asynchronous programming model is built around:

  • async keyword to define coroutines
  • await keyword to pause execution until a future is resolved
  • Event loop that schedules and executes coroutines

Example of a simple Python coroutine:

import asyncio
 
async def fetch_data():
    print("Start fetching")
    await asyncio.sleep(2)  # Simulates I/O operation
    print("Done fetching")
    return {"data": "here"}
 
async def main():
    result = await fetch_data()
    print(result)
 
asyncio.run(main())

Key characteristics:

  • Calling an async function only creates a coroutine object; nothing runs until the event loop schedules it
  • Control flow is transferred back to the event loop at each await point
  • Cooperative multitasking: coroutines explicitly yield control

Asynchronous Programming in Rust

Futures in Rust

In Rust, asynchronous programming is built around the Future trait, which represents a computation that can produce a value at some point:

trait Future {
    type Output;
    fn poll(self: Pin<&mut Self>, cx: &mut Context) -> Poll<Self::Output>;
}
 
enum Poll<T> {
    Ready(T),
    Pending,
}

A Future does nothing on its own; it only makes progress when poll() is called by a runtime.
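
To make the polling model concrete, here is a minimal hand-written future (the type and field names are purely illustrative) that returns Pending on its first poll and Ready on its second:

use std::future::Future;
use std::pin::Pin;
use std::task::{Context, Poll};

// Resolves with its value on the second poll.
struct YieldOnce {
    polled: bool,
    value: u32,
}

impl Future for YieldOnce {
    type Output = u32;

    fn poll(mut self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<Self::Output> {
        if self.polled {
            Poll::Ready(self.value)
        } else {
            self.polled = true;
            // Ask the runtime to poll this future again; without the wake-up
            // it would never be rescheduled.
            cx.waker().wake_by_ref();
            Poll::Pending
        }
    }
}

Under async/await you rarely implement poll by hand, but every .await compiles down to exactly this polling protocol.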

Async/Await in Rust

Rust’s async/await syntax provides a more ergonomic way to work with futures:

use tokio::fs::File;
use tokio::io::{self, AsyncReadExt};

// `File` here is an async-aware file handle (Tokio's), not std::fs::File.
async fn read_data(file: &mut File) -> Result<Vec<u8>, io::Error> {
    let mut buffer = Vec::new();
    file.read_to_end(&mut buffer).await?;
    Ok(buffer)
}

The compiler transforms this into a state machine that implements the Future trait.
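
Conceptually (a rough sketch, not the compiler's exact expansion, reusing the imports from the snippet above), the async fn is equivalent to a plain function that returns an anonymous type implementing Future, with the body wrapped in an async block:

use std::future::Future;

fn read_data<'a>(file: &'a mut File) -> impl Future<Output = Result<Vec<u8>, io::Error>> + 'a {
    async move {
        let mut buffer = Vec::new();
        file.read_to_end(&mut buffer).await?;
        Ok(buffer)
    }
}

Each .await in the body becomes a suspension point in the generated state machine.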

Asynchronous Runtimes in Rust

Unlike Python, Rust doesn’t include an async runtime in its standard library. Instead, users choose from libraries like Tokio or async-std:

#[tokio::main]
async fn main() {
    // Asynchronous code here
}
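
The attribute macro is essentially shorthand for constructing a runtime explicitly and blocking the main thread on the top-level future, roughly:

fn main() {
    tokio::runtime::Runtime::new()
        .expect("failed to build Tokio runtime")
        .block_on(async {
            // Asynchronous code here
        });
}

Other runtimes such as async-std and smol expose similar entry points; the choice of runtime then determines which async I/O types and utilities the rest of the program uses.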

Implementing an Asynchronous HTTP Client

Below is a solution for the asynchronous HTTP client task, using Tokio together with the trust-dns-resolver crate for asynchronous DNS lookups:

use std::net::SocketAddr;
use std::time::Duration;
use tokio::net::TcpStream;
use tokio::io::{AsyncWriteExt, AsyncReadExt};
use tokio::time::timeout;
use trust_dns_resolver::TokioAsyncResolver;
use trust_dns_resolver::config::{ResolverConfig, ResolverOpts};
 
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let domain = "example.com";
    let path = "/";
    
    // Perform asynchronous DNS lookup
    let resolver = TokioAsyncResolver::tokio(
        ResolverConfig::default(),
        ResolverOpts::default()
    )?;
    
    let response = resolver.lookup_ip(domain).await?;
    let addresses: Vec<_> = response.iter().collect();
    
    if addresses.is_empty() {
        return Err("No addresses found for domain".into());
    }
    
    // Create futures for each connection attempt
    let mut connection_attempts = Vec::new();
    for addr in addresses {
        let socket_addr = SocketAddr::new(addr, 80);
        connection_attempts.push(timeout(
            Duration::from_secs(5), 
            TcpStream::connect(socket_addr)
        ));
    }
    
    // Try each address in turn (with a timeout) until one connects
    let mut stream = None;
    for attempt in connection_attempts {
        if let Ok(Ok(s)) = attempt.await {
            stream = Some(s);
            break;
        }
    }
    
    let mut stream = match stream {
        Some(s) => s,
        None => return Err("Failed to connect to any address".into()),
    };
    
    // Send HTTP request
    let request = format!(
        "GET {} HTTP/1.1\r\nHost: {}\r\nConnection: close\r\n\r\n",
        path, domain
    );
    stream.write_all(request.as_bytes()).await?;
    
    // Read and print response
    let mut buffer = Vec::new();
    stream.read_to_end(&mut buffer).await?;
    println!("{}", String::from_utf8_lossy(&buffer));
    
    Ok(())
}
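
Note that the loop above tries the resolved addresses one at a time, moving on only after an attempt fails or times out. To actually race the attempts concurrently, one option (a sketch that assumes the futures crate is added as a dependency) is to drive them all through a FuturesUnordered and take the first success:

use std::net::{IpAddr, SocketAddr};
use std::time::Duration;
use futures::stream::{FuturesUnordered, StreamExt};
use tokio::net::TcpStream;
use tokio::time::timeout;

// Race connection attempts against all resolved addresses concurrently and
// return the first stream that connects within the timeout, if any.
async fn connect_first(addresses: &[IpAddr], port: u16) -> Option<TcpStream> {
    let mut pending: FuturesUnordered<_> = addresses
        .iter()
        .map(|&addr| timeout(Duration::from_secs(5), TcpStream::connect(SocketAddr::new(addr, port))))
        .collect();

    // Completed attempts come out of the stream in completion order; take the
    // first successful one and drop the rest.
    while let Some(attempt) = pending.next().await {
        if let Ok(Ok(stream)) = attempt {
            return Some(stream);
        }
    }
    None
}

In main, the sequential loop would then be replaced by something like let stream = connect_first(&addresses, 80).await;.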

Comparison with the Multi-threaded Approach

Advantages of Async Approach

  1. Resource Efficiency:

    • Async code can handle many concurrent operations with far fewer threads (see the sketch after this list)
    • Lower memory overhead (no need for thread stacks for each connection)
    • Less context switching overhead
  2. Scalability:

    • Can handle thousands of concurrent connections with minimal resources
    • Well-suited for I/O-bound workloads
  3. Code Organization:

    • The code flow closely resembles sequential code, making it easier to follow
    • Error handling flows naturally with the code structure
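
As a concrete (if artificial) illustration of the resource argument, the sketch below spawns ten thousand concurrent waits as Tokio tasks; each task is a small heap-allocated state machine rather than an OS thread with its own stack:

use std::time::Duration;

#[tokio::main]
async fn main() {
    // 10_000 concurrent sleeps as lightweight tasks.
    let handles: Vec<_> = (0..10_000)
        .map(|i| {
            tokio::spawn(async move {
                tokio::time::sleep(Duration::from_millis(100)).await;
                i
            })
        })
        .collect();

    // Wait for every task to finish.
    for handle in handles {
        handle.await.unwrap();
    }
}

Doing the same with one OS thread per wait would require a full stack per thread plus kernel scheduling overhead.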

Disadvantages

  1. Ecosystem Fragmentation:

    • Multiple competing runtimes (Tokio, async-std, smol)
    • Libraries need async versions of their APIs
    • Not all libraries support async
  2. Complexity:

    • The async transformation is not always transparent
    • Debugging can be more difficult
    • Stack traces are less informative
  3. Risks:

    • Blocking operations can stall the entire runtime
    • Long-running computations need to be moved to a separate thread pool (see the spawn_blocking sketch after this list)
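
For the last point, Tokio provides spawn_blocking to move blocking or CPU-heavy work onto a dedicated thread pool so that it cannot starve the async worker threads. A minimal sketch, using a hypothetical expensive_hash function:

// Hypothetical CPU-bound function; calling it directly inside an async task
// would tie up a runtime worker thread and stall other tasks scheduled on it.
fn expensive_hash(input: Vec<u8>) -> u64 {
    input.iter().fold(0u64, |acc, &b| acc.wrapping_mul(31).wrapping_add(b as u64))
}

async fn handle_request(payload: Vec<u8>) -> u64 {
    // Run the computation on Tokio's blocking thread pool instead.
    tokio::task::spawn_blocking(move || expensive_hash(payload))
        .await
        .expect("blocking task panicked")
}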

Discussion Questions

  1. Is the asynchronous version simpler and/or more efficient?

    The async version is more efficient in terms of resource usage, especially when dealing with many connections. For simple programs, the threading approach might be simpler conceptually, but as complexity grows, the async approach helps maintain cleaner code structure.

  2. To what extent does the use of multiple asynchronous runtime libraries fragment the Rust ecosystem?

    The fragmentation is a significant issue. Having multiple competing runtimes means:

    • Library authors must choose which runtime to support or implement compatibility layers
    • Users must choose a runtime and stick with it, or deal with interoperability issues
    • Documentation and examples may not work across different runtimes
    • The learning curve becomes steeper for newcomers
  3. How does asynchronous programming in Rust compare to Python?

    Rust’s approach differs from Python in several ways:

    • Python ships a standard async runtime (asyncio) in its standard library
    • Rust requires explicit selection of a third-party runtime
    • Rust has more complex types due to its ownership system
    • Rust provides stronger guarantees about thread safety
    • Python’s GIL limits true parallelism, while Rust has no such limitation
    • Rust’s async transformations are more complex due to the need to maintain memory safety without garbage collection

    The async/await abstraction serves the same purpose in both languages, but it is implemented differently to accommodate each language’s design philosophy and constraints.