Why proxy requests?

There are two main benefits of using a proxy server to route requests to ChainPatrol:
  1. Prevent leaking your API key - You can proxy requests to ChainPatrol through your own server, which adds the ChainPatrol API key to each request on the server side. This way, your API key is never exposed to the public.
  2. Protect your users’ privacy - By proxying requests through your own server, ChainPatrol’s servers never see your users’ IP addresses when you use the /asset/check API, because requests to ChainPatrol originate from your server rather than from your users’ browsers.

How to proxy requests

You can proxy requests to ChainPatrol using any server-side technology. Below are some examples of how to proxy requests using different technologies.
The provided examples are just a starting point and do not include any error handling, rate limiting, or request whitelisting. You should implement these features to ensure that your proxy server is secure and reliable.
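As a rough, platform-agnostic sketch of that kind of hardening, the handler below only accepts POST requests and fails closed if the upstream call throws; the endpoint and header name match the examples that follow, while the placeholder API key, status codes, and validation rules are assumptions to adapt to your own setup.
const UPSTREAM_URL = "https://app.chainpatrol.io/api/v2/asset/check";
const API_KEY = "YOUR_API_KEY"; // placeholder; load from a secret or environment variable in practice

// Sketch of a hardened proxy handler (the same idea applies on Workers or Deno)
async function handleProxyRequest(request: Request): Promise<Response> {
  // Basic request whitelisting: only forward POST requests
  if (request.method !== "POST") {
    return new Response("Method not allowed", { status: 405 });
  }

  try {
    const upstreamRequest = new Request(new URL(UPSTREAM_URL), request);
    upstreamRequest.headers.set("X-API-KEY", API_KEY);
    return await fetch(upstreamRequest);
  } catch {
    // Return a generic error instead of leaking upstream details to the client
    return new Response("Upstream error", { status: 502 });
  }
}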

Cloudflare Workers

Cloudflare Workers is a serverless platform that allows you to run JavaScript code on Cloudflare’s edge network, which means you can run your own serverless proxy server inexpensively.

1. Create a new Cloudflare Worker

Go to the Cloudflare Workers dashboard and create a new worker.

2. Copy and paste the code below

Make sure to replace the API_KEY variable with your ChainPatrol API key.
const API_KEY = 'YOUR_API_KEY';
const UPSTREAM_URL = 'https://app.chainpatrol.io/api/v2/asset/check';

export default {
  async fetch(request, env, ctx) {
    // Create a new request to the upstream server
    const url = new URL(UPSTREAM_URL);
    const upstreamRequest = new Request(url, request);

    // Set the ChainPatrol API key
    upstreamRequest.headers.set('X-API-KEY', API_KEY);

    // Return the response
    return fetch(upstreamRequest);
  },
};
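
If you prefer not to hardcode the key in your source, Cloudflare Workers can also store it as a secret (created with wrangler secret put API_KEY) and expose it on the env parameter; the variant below is a minimal sketch that assumes a secret named API_KEY.
const UPSTREAM_URL = 'https://app.chainpatrol.io/api/v2/asset/check';

export default {
  async fetch(request, env, ctx) {
    // Create a new request to the upstream server
    const url = new URL(UPSTREAM_URL);
    const upstreamRequest = new Request(url, request);

    // Read the ChainPatrol API key from the API_KEY secret bound to this Worker
    upstreamRequest.headers.set('X-API-KEY', env.API_KEY);

    // Return the response
    return fetch(upstreamRequest);
  },
};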

3. Test the proxy server

Click Save and deploy, then test the proxy server using curl:
curl --request POST \
  --url https://chainpatrol-proxy.your-subdomain.workers.dev/ \
  --header 'Content-Type: application/json' \
  --data '{
    "type": "URL",
    "content": "google.com"
  }'
Output
{"status":"ALLOWED"}

4. Use the proxy server

You can now use the proxy server by sending requests to the URL of your Cloudflare Worker. For example, if your Cloudflare Worker is named chainpatrol-proxy, you can send requests to https://chainpatrol-proxy.your-subdomain.workers.dev.
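For example, a client can call the proxy with a plain fetch and no API key; this sketch assumes the Worker URL above and reuses the request body from the curl test.
// Hypothetical client-side call to the proxy (no ChainPatrol API key on the client)
const response = await fetch("https://chainpatrol-proxy.your-subdomain.workers.dev/", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ type: "URL", content: "google.com" }),
});

const result = await response.json();
console.log(result.status); // e.g. "ALLOWED"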

Deno Deploy

Deno Deploy is a serverless platform that allows you to run JavaScript and TypeScript code on Deno’s edge network, which means you can run your own serverless proxy server inexpensively.

1. Create a new Deno Deploy project

Go to the Deno Deploy dashboard, click New project, and fork the Hello World example.

2. Copy and paste the code below

import { serve } from "https://deno.land/std@0.177.0/http/server.ts"; // any std version that exports `serve` works; 0.177.0 is one example

const UPSTREAM_URL = "https://app.chainpatrol.io/api/v2/asset/check";

async function handler(request: Request): Promise<Response> {
    // Create a new request to the upstream server
    const url = new URL(UPSTREAM_URL);
    const upstreamRequest = new Request(url, request);

    // Set the ChainPatrol API key
    upstreamRequest.headers.set("X-API-KEY", Deno.env.get("API_KEY") ?? "");

    // Return the response
    return fetch(upstreamRequest);
}

serve(handler);

3. Set the API_KEY environment variable

Add the API_KEY environment variable in the Deno editor and set it to your ChainPatrol API key.

4. Use the proxy server

You can now use the proxy server by sending requests to the URL of your Deno Deploy project. For example, if your Deno Deploy project is named chainpatrol-proxy, you can send requests to https://chainpatrol-proxy.deno.dev.
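To verify the deployment, you can reuse the curl test from the Cloudflare Workers example against your Deno Deploy URL (this assumes the project is named chainpatrol-proxy):
curl --request POST \
  --url https://chainpatrol-proxy.deno.dev/ \
  --header 'Content-Type: application/json' \
  --data '{
    "type": "URL",
    "content": "google.com"
  }'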

Important Considerations

  • When deploying your proxy server, consider the ingress and egress volume that you’ll handle from your clients. If you’re expecting a lot of traffic, you may want to consider using a serverless platform like Cloudflare Workers or Deno Deploy to reduce costs.
  • Try to keep your proxy server as close to your clients as possible. This will reduce latency and improve performance.
  • When deploying your proxy server, make sure to use HTTPS. This will ensure that your requests are encrypted and secure.
  • We reserve the right to block/rate-limit your API key if we detect excessive usage or abuse. Please make sure to handle errors gracefully in the event of unexpected downtime and let us know when you are planning on going live with your integration, so that we can provision the appropriate resources to meet your demand. If you have any questions, please contact us at [email protected].
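
On the last point, a small retry-with-backoff wrapper around calls to your proxy is one way to tolerate brief downtime; this is only a sketch, and the proxy URL, retry count, and delays are placeholder assumptions.
// Hypothetical client helper: retry a proxy call with exponential backoff
async function checkAsset(content: string, retries = 3): Promise<unknown> {
  for (let attempt = 0; attempt < retries; attempt++) {
    try {
      const response = await fetch("https://chainpatrol-proxy.your-subdomain.workers.dev/", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ type: "URL", content }),
      });
      if (response.ok) return await response.json();
    } catch {
      // Network error: fall through to the backoff below
    }
    if (attempt < retries - 1) {
      // Wait 500ms, 1s, 2s, ... before the next attempt
      await new Promise((resolve) => setTimeout(resolve, 500 * 2 ** attempt));
    }
  }
  throw new Error("ChainPatrol proxy unavailable after retries");
}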