# ChatGPT Scraper API

Send a GET request to **`https://api.scrapingdog.com/chatgpt`** with the parameters listed below.

### Parameters

#### Scrapingdog Parameters

| Parameter                                                          | Description                                                                                                                                                                 |
| ------------------------------------------------------------------ | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| <p>api\_key<br></p><p><mark style="color:red;">required</mark></p> | <p>Your personal API key. Available on your dashboard <br><br>Type: <strong><code>String</code></strong></p>                                                                |
| html                                                               | <p>Set to <code>true</code> to return the full HTML of the ChatGPT page instead of parsed JSON. <br><br>Default value: <strong><code>false</code></strong> <br><br>Type: <strong><code>String</code></strong></p> |
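For example, adding `html=true` switches the response from parsed JSON to the raw HTML of the page. The sketch below (Python, using `requests`) builds the request without sending it, to show how the parameters, including a prompt with spaces and punctuation, are percent-encoded for you; the API key and prompt are placeholders:

```python
import requests

# Placeholder values for illustration only
params = {
    "api_key": "APIKEY",                 # your Scrapingdog API key
    "prompt": "what is web scraping?",   # prompts with spaces are encoded automatically
    "html": "true",                      # request full HTML instead of parsed JSON
}

# Prepare (but do not send) the request to inspect the final URL
req = requests.Request("GET", "https://api.scrapingdog.com/chatgpt", params=params).prepare()
print(req.url)
```

Because `requests` handles the encoding, you never need to escape the prompt yourself when passing it via `params`.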

#### Search Query

| Parameter                     | Description                                                                                           |
| ----------------------------- | ----------------------------------------------------------------------------------------------------- |
| <p>prompt<br></p><p><mark style="color:red;">required</mark></p> | <p>The prompt you want to send to ChatGPT. <br><br>Type: <strong><code>String</code></strong></p> |

### API Example

{% tabs %}
{% tab title="cURL" %}

```bash
curl "https://api.scrapingdog.com/chatgpt/?api_key=APIKEY&prompt=scrapingdog"
```

{% endtab %}

{% tab title="Python" %}

```python
import requests

api_key = "APIKEY"
url = "https://api.scrapingdog.com/chatgpt"

params = {
    "api_key": api_key,
    "prompt": "scrapingdog"
}

response = requests.get(url, params=params)

if response.status_code == 200:
    data = response.json()
    print(data)
else:
    print(f"Request failed with status code: {response.status_code}")

```

{% endtab %}

{% tab title="Node JS" %}

```js
const axios = require('axios');

const api_key = 'APIKEY';
const url = 'https://api.scrapingdog.com/chatgpt';

const params = {
  api_key: api_key,
  prompt: 'scrapingdog'
};

axios
  .get(url, { params: params })
  .then(function (response) {
    if (response.status === 200) {
      const data = response.data;
      console.log(data);
    } else {
      console.log('Request failed with status code: ' + response.status);
    }
  })
  .catch(function (error) {
    console.error('Error making the request: ' + error.message);
  });

```

{% endtab %}

{% tab title="PHP" %}

```php
<?php

// Set the API key and request parameters
$api_key = 'APIKEY';
$prompt = 'scrapingdog';

// Set the API endpoint
$url = 'https://api.scrapingdog.com/chatgpt/?api_key=' . $api_key . '&prompt=' . urlencode($prompt);

// Initialize cURL session
$ch = curl_init($url);

// Set cURL options
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

// Execute the cURL request
$response = curl_exec($ch);

// Check if the request was successful
if ($response === false) {
    echo 'cURL error: ' . curl_error($ch);
} else {
    // Process the response data as needed
    echo $response;
}

// Close the cURL session
curl_close($ch);

```

{% endtab %}

{% tab title="Ruby" %}

```ruby
require 'net/http'
require 'uri'

# Set the API key and request parameters
api_key = 'APIKEY'
prompt = 'scrapingdog'

# Construct the API endpoint URL
url = URI.parse("https://api.scrapingdog.com/chatgpt/?api_key=#{api_key}&prompt=#{URI.encode_www_form_component(prompt)}")

# Create an HTTP GET request
request = Net::HTTP::Get.new(url)

# Create an HTTP client
http = Net::HTTP.new(url.host, url.port)
http.use_ssl = true # Enable SSL (https)

# Send the request and get the response
response = http.request(request)

# Check if the request was successful
if response.is_a?(Net::HTTPSuccess)
  puts response.body # Process the response data as needed
else
  puts "HTTP request failed with code: #{response.code}, message: #{response.message}"
end

```

{% endtab %}

{% tab title="Java" %}

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.io.IOException;

public class Main {
    public static void main(String[] args) {
        try {
            // Set the API key and request parameters
            String apiKey = "APIKEY";
            String prompt = "scrapingdog";

            // Construct the API endpoint URL
            String apiUrl = "https://api.scrapingdog.com/chatgpt/?api_key=" + apiKey
                    + "&prompt=" + prompt;

            // Create a URL object from the API URL string
            URL url = new URL(apiUrl);

            // Open a connection to the URL
            HttpURLConnection connection = (HttpURLConnection) url.openConnection();

            // Set the request method to GET
            connection.setRequestMethod("GET");

            // Get the response code
            int responseCode = connection.getResponseCode();

            if (responseCode == 200) {
                // Read the response from the connection
                BufferedReader reader = new BufferedReader(new InputStreamReader(connection.getInputStream()));
                String inputLine;
                StringBuilder response = new StringBuilder();

                while ((inputLine = reader.readLine()) != null) {
                    response.append(inputLine);
                }
                reader.close();

                // Process the response data as needed
                System.out.println(response.toString());
            } else {
                System.out.println("HTTP request failed with response code: " + responseCode);
            }

            // Close the connection
            connection.disconnect();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

```

{% endtab %}
{% endtabs %}

### API Response

```json
{
    "conversation": [
        {
            "role": "user",
            "message": "scrapingdog"
        },
        {
            "role": "assistant",
            "response": [
                {
                    "type": "paragraph",
                    "text": "It sounds like you're asking about Scrapingdog. Scrapingdog is a web scraping API that allows users to scrape websites without worrying about the complexity of managing proxies, CAPTCHAs, or user-agent issues. It's designed to make it easier for developers to access data from websites at scale."
                },
                {
                    "type": "paragraph",
                    "text": "Some key features of Scrapingdog include:"
                },
                {
                    "type": "numbered_list",
                    "items": [
                        {
                            "heading": "Proxy Rotations",
                            "text": "It handles proxy rotation to avoid detection by websites, which helps avoid being blocked."
                        },
                        {
                            "heading": "Headless Browsing",
                            "text": "The API uses headless browsers (like Puppeteer or similar) to simulate real browsing behavior, making it harder for websites to detect scraping activity."
                        },
                        {
                            "heading": "Easy-to-use API",
                            "text": "You can make requests through a simple API, passing the URL of the page you want to scrape, and the API will return the HTML or structured data (JSON, CSV, etc.)."
                        },
                        {
                            "heading": "CAPTCHA Solving",
                            "text": "It can bypass CAPTCHAs that might normally block scraping attempts."
                        },
                        {
                            "heading": "Data Extraction",
                            "text": "Supports easy data extraction with clean output formats like JSON."
                        }
                    ]
                },
                {
                    "type": "paragraph",
                    "text": "Is there something specific you'd like to know about Scrapingdog? Or are you looking for help with setting up a scraper using it?"
                }
            ]
        }
    ]
}
```
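The assistant's reply arrives as a list of typed blocks (`paragraph`, `numbered_list`), so flattening it into plain text takes a short traversal. A minimal Python sketch; the `data` dict here is an abridged copy of the sample response above, standing in for `response.json()`:

```python
# Abridged copy of the sample response, standing in for response.json()
data = {
    "conversation": [
        {"role": "user", "message": "scrapingdog"},
        {"role": "assistant", "response": [
            {"type": "paragraph", "text": "Some key features of Scrapingdog include:"},
            {"type": "numbered_list", "items": [
                {"heading": "Proxy Rotations",
                 "text": "It handles proxy rotation to avoid detection by websites."},
            ]},
        ]},
    ]
}

# Walk the assistant turns and flatten each block type into a text line
lines = []
for turn in data["conversation"]:
    if turn["role"] != "assistant":
        continue
    for block in turn["response"]:
        if block["type"] == "paragraph":
            lines.append(block["text"])
        elif block["type"] == "numbered_list":
            for i, item in enumerate(block["items"], 1):
                lines.append(f"{i}. {item['heading']}: {item['text']}")

print("\n".join(lines))
```

Other block types, if the API introduces them, would need their own branch in the loop.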
