My crawler utility class - a Python crawler

Published: 2021-07-25 21:37:41
# Static method of the crawlerTool class; requires `import json` and `import urllib2`
# at module level (Python 2 only - see the Python 3 sketch below).
#
# getPageByPostJson: the data input is a dict
# getPage(url,data=xx)  getPage(url,requestPars.=xx)
@staticmethod
def getPageByJson(url, proxy=None, data={}, referer=None, cookie=None, userAgent=None, cookiePath=None):
    # append the requested url to the class-level log string
    crawlerTool.log = crawlerTool.log + url
    page_buf = ''
    for i in range(1):  # single attempt; kept as a loop so a retry count is easy to add
        try:
            if proxy:
                # route both http and https traffic through the given proxy host
                handlers = [urllib2.ProxyHandler({'http': 'http://%s/' % proxy,
                                                  'https': 'http://%s/' % proxy})]
                opener = urllib2.build_opener(*handlers)
            else:
                opener = urllib2.build_opener()
            if type(data) == type({}):
                data = json.dumps(data)
            method = urllib2.Request(url, data=data)  # note: Python None maps to JSON null
            method.add_header('Content-Type', 'application/json')
            if referer:
                method.add_header('Referer', referer)
            if cookiePath:
                method.add_header('Cookie', crawlerTool.readCookie(cookiePath))
            if cookie:
                method.add_header('Cookie', cookie)
            if userAgent:
                method.add_header('User-Agent', userAgent)
            else:
                method.add_header('User-Agent', 'Mozilla/5.0 (X11; Linux i686) AppleWebKit/537.36 '
                                                '(KHTML, like Gecko) Chrome/42.0.2311.135 Safari/537.36')
            method.add_header('Accept-Language', 'en-US,en;q=0.5')
            result = opener.open(method, timeout=10)
            page_buf = result.read()
            return page_buf
        except urllib2.URLError, reason:
            crawlerTool.log = crawlerTool.log + str(reason)
            return str(reason)
        except Exception, reason:
            crawlerTool.log = crawlerTool.log + str(reason)
            raise Exception(reason)
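The method above targets Python 2's urllib2 module, which no longer exists in Python 3. As a minimal sketch of the same idea on Python 3 (assuming only the standard library; the function name, parameters, and the example endpoint below are my own, not part of the original class), the JSON POST can be written with urllib.request instead:

# A rough Python 3 equivalent of getPageByJson (a sketch, not the article's code).
import json
import urllib.request
import urllib.error

def get_page_by_json(url, proxy=None, data=None, referer=None,
                     cookie=None, user_agent=None, timeout=10):
    """POST `data` as JSON to `url` and return the decoded response body."""
    handlers = []
    if proxy:
        # e.g. proxy = "127.0.0.1:8080"
        handlers.append(urllib.request.ProxyHandler({
            'http': 'http://%s/' % proxy,
            'https': 'http://%s/' % proxy,
        }))
    opener = urllib.request.build_opener(*handlers)

    body = json.dumps(data or {}).encode('utf-8')  # Python 3 needs a bytes body
    req = urllib.request.Request(url, data=body)
    req.add_header('Content-Type', 'application/json')
    if referer:
        req.add_header('Referer', referer)
    if cookie:
        req.add_header('Cookie', cookie)
    req.add_header('User-Agent', user_agent or
                   'Mozilla/5.0 (X11; Linux i686) AppleWebKit/537.36 '
                   '(KHTML, like Gecko) Chrome/42.0.2311.135 Safari/537.36')
    req.add_header('Accept-Language', 'en-US,en;q=0.5')

    try:
        with opener.open(req, timeout=timeout) as resp:
            return resp.read().decode('utf-8', errors='replace')
    except urllib.error.URLError as reason:
        return str(reason)

# Hypothetical usage (the endpoint is made up for illustration):
# page = get_page_by_json('https://httpbin.org/post', data={'q': 'hello'})

The sketch keeps the original behaviour of returning the error string on URLError instead of raising; whether that is a good design depends on how the caller distinguishes a page body from an error message.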
