convert cURL to python requests
Created June 3, 2020 13:45
I'm trying to convert the following working request in curl to a Python request (using Requests, http://docs.python-requests.org/en/v0.10.7/):

curl --data 'query={"tags":["test1","test2"]}' http://www.test.com/match

(Please note, I've used a fake URL, but the command does work with the real URL.)

The receiving end (running in Flask) does this:

@app.route("/match", methods=['POST'])
def tagmatch():
    query = json.loads(request.form['query'])
    tags = query.get('tags')
    ... does stuff ...
    return json.dumps(stuff)

In curl (7.30), run on Mac OS X (10.9), the command above properly returns a JSON list filtered using the tag query.

My Python script is as follows; it returns a 400 Bad Request.

import requests

payload = {"tags":["test1", "test2"]}
# also tried payload = 'query={"tags":["test1","test2"]}'
url = 'http://www.test.com/match'
r = requests.post(url, data=payload)

if __name__=='__main__':
    print r.text

I feel I'm missing something small, and any help would be appreciated.
Thank you
Tags: python, python-2.7, curl, python-requests
asked Dec 9 '13 at 0:37 by zalc (edited Jul 11 '18 by OneCricketeer)
payload = {'query': '{"tags":["test1","test2"]}'} works, but flyer pointed me in the right direction. Thank you. I wasn't properly forming the Python dict. – zalc Dec 9 '13 at 4:32
7 Answers
Your server is expecting JSON, but you aren't sending it. Try this:

import requests
import json

payload = {'query': json.dumps({"tags":["test1", "test2"]})}
url = 'http://www.test.com/match'
r = requests.post(url, data=payload)

if __name__=='__main__':
    print r.text
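To see why this fixes the 400, here's a stdlib-only sketch of what that data= call actually puts on the wire: a form-encoded body whose "query" field holds a JSON string, the same shape the curl command sends and the same shape the Flask view's request.form['query'] expects.

```python
import json
try:
    from urllib.parse import urlencode, parse_qs  # Python 3
except ImportError:
    from urllib import urlencode                  # Python 2
    from urlparse import parse_qs

# data={'query': json.dumps(...)} is sent as a form-encoded body whose
# "query" field holds a JSON string -- matching the curl --data argument.
payload = {'query': json.dumps({"tags": ["test1", "test2"]})}
body = urlencode(payload)

# Decoding it the way the server would recovers the original structure.
query = json.loads(parse_qs(body)['query'][0])
print(query['tags'])
```

The original broken script sent data={"tags": [...]}, which form-encodes the dict directly and produces no "query" field at all, hence the server's KeyError and the 400.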
answered Dec 9 '13 at 10:13 by Lukasa
| In new versions of requests, there is also a json parameter to post. – Michel Samia Jan 6 '15 at 17:18 | |
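For reference, the json= keyword mentioned above sends a different body shape than the accepted answer's data=. A quick sketch of the difference (the request.get_json() change to the server is a hypothetical, not part of the original question):

```python
import json

payload = {"tags": ["test1", "test2"]}

# json=payload would serialize the dict itself as the request body and set
# Content-Type: application/json:
json_body = json.dumps(payload)

# The question's Flask view reads request.form['query'], which only parses
# form-encoded bodies, so the server would have to switch to
# request.get_json() (a hypothetical change) before json= could be used.
print(json_body)
```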
There is a wonderful open source cURL to Python Requests conversion helper at http://curl.trillworks.com. It isn't perfect, but it helps a lot of the time, especially for converting Chrome "Copy as cURL" commands. There is also a node library if you need to do the conversions programmatically.
cURL from Chrome
answered Apr 16 '15 at 23:53 by Gourneau (edited Jan 21 '16)
Thanks a ton! Perfect site! – Chaitanya Bapat Feb 4 '17 at 8:20
Really useful tip. – Vikrame Nov 26 '17 at 4:52
Not working ;)) – snr Jan 15 at 11:46
Save your life
A simpler approach would be:
Open Postman.
Click the Import tab on the upper left side.
Select the Raw Text option and paste your cURL command.
Hit Import and you will have the command in your Postman builder!
Hope this helps!
Credit: Onkaar Singh
answered Feb 12 '19 at 17:53 by MKRNaqeebi
We need a solution to automate something; Postman is only for inspecting or playing with APIs. My goal is to send 1000s of requests from "Copy as cURL", and for that we need to automate this process. Save your life! – rohitcoder May 14 at 14:08
@rohitcoder use pycurl – MKRNaqeebi May 15 at 15:06
Try this:
https://github.com/spulec/uncurl

import uncurl

print uncurl.parse("curl 'https://pypi.python.org/pypi/uncurl' -H 'Accept-Encoding: gzip,deflate,sdch'")
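The core idea behind uncurl is that a curl invocation is just a shell command line, so it can be tokenized and mapped onto Requests arguments. A rough stdlib-only sketch of that idea (parse_curl here is illustrative, not uncurl's real API):

```python
import shlex

def parse_curl(command):
    """Tokenize a curl command and pull out the URL and -H headers."""
    tokens = shlex.split(command)
    url, headers = None, {}
    it = iter(tokens[1:])  # skip the leading "curl"
    for tok in it:
        if tok == '-H':
            # Header values look like "Name: value"; split on first colon.
            key, _, value = next(it).partition(':')
            headers[key.strip()] = value.strip()
        elif not tok.startswith('-'):
            url = tok
    return url, headers

url, headers = parse_curl(
    "curl 'https://pypi.python.org/pypi/uncurl' -H 'Accept-Encoding: gzip,deflate,sdch'"
)
print(url, headers)
```

shlex.split handles the shell quoting for free, which is why these converters cope with "Copy as cURL" output pasted straight from a browser.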
answered Aug 18 '17 at 5:29 by Pegasus
I wrote an HTTP client plugin for Sublime Text called Requester, and one of its features is converting calls from cURL to Requests, and vice versa.
If you're using Sublime Text, this is probably your fastest, easiest option. If not, here's the code that actually handles the conversion from cURL to Requests. It's based on uncurl, but with various improvements and bug fixes.
import argparse
import json

try:
    from urllib.parse import urlencode, parse_qsl  # Python 3
except ImportError:  # Python 2 fallback
    from urllib import urlencode
    from urlparse import parse_qsl

if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument('command')
    parser.add_argument('url')
    parser.add_argument('-X', '--request', default=None)
    parser.add_argument('-d', '--data', default=None)
    parser.add_argument('-G', '--get', action='store_true', default=False)
    parser.add_argument('-b', '--cookie', default=None)
    parser.add_argument('-H', '--header', action='append', default=[])
    parser.add_argument('-A', '--user-agent', default=None)
    parser.add_argument('--data-binary', default=None)
    parser.add_argument('--compressed', action='store_true')
    parsed_args = parser.parse_args()

    method = 'get'
    if parsed_args.request:
        method = parsed_args.request

    base_indent = ' ' * 4
    post_data = parsed_args.data or parsed_args.data_binary or ''
    if post_data:
        if not parsed_args.request:
            method = 'post'
        # Try to interpret the body as JSON first, then as a form body.
        try:
            post_data = json.loads(post_data)
        except ValueError:
            try:
                post_data = dict(parse_qsl(post_data))
            except:
                pass

    cookies_dict = {}
    if parsed_args.cookie:
        cookies = parsed_args.cookie.split(';')
        for cookie in cookies:
            key, value = cookie.strip().split('=')
            cookies_dict[key] = value

    data_arg = 'data'
    headers_dict = {}
    for header in parsed_args.header:
        key, value = header.split(':', 1)
        if key.lower().strip() == 'content-type' and value.lower().strip() == 'application/json':
            data_arg = 'json'
        if key.lower() == 'cookie':
            cookies = value.split(';')
            for cookie in cookies:
                key, value = cookie.strip().split('=')
                cookies_dict[key] = value
        else:
            headers_dict[key] = value.strip()
    if parsed_args.user_agent:
        headers_dict['User-Agent'] = parsed_args.user_agent

    qs = ''
    if parsed_args.get:
        method = 'get'
        try:
            qs = '?{}'.format(urlencode(post_data))
        except:
            qs = '?{}'.format(str(post_data))
        post_data = {}

    result = """requests.{method}('{url}{qs}',{data}\n{headers},\n{cookies},\n)""".format(
        method=method.lower(),
        url=parsed_args.url,
        qs=qs,
        data='\n{}{}={},'.format(base_indent, data_arg, post_data) if post_data else '',
        headers='{}headers={}'.format(base_indent, headers_dict),
        cookies='{}cookies={}'.format(base_indent, cookies_dict),
    )
    print(result)
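As a worked example, the --data fallback logic above resolves the curl command from the question like this (the relevant lines extracted into a standalone snippet):

```python
import json
try:
    from urllib.parse import parse_qsl  # Python 3
except ImportError:
    from urlparse import parse_qsl      # Python 2

data = 'query={"tags":["test1","test2"]}'
try:
    post_data = json.loads(data)        # fails: the body is not pure JSON
except ValueError:
    post_data = dict(parse_qsl(data))   # form parsing recovers the field
print(post_data)
```

So the generated call ends up as requests.post with data={'query': '{"tags":["test1","test2"]}'}, which is exactly the payload the accepted answer arrives at by hand.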