Why do I only get this error when scraping from today's web page, but not yesterday's?
Problem description:
I'm using SQLite to store the prices of certain stocks. I'm not able to get a series of prices by logging into the website, so I save the web page locally at different times to keep track of the different prices. Why does this error only appear when I scrape from today's saved page and not yesterday's?
My code currently works for the two saves I made previously, but when I try to use any page saved today I get the following error:
UnicodeDecodeError: 'charmap' codec can't decode byte 0x90 in position 415367: character maps to <undefined>
My code is below:
from bs4 import BeautifulSoup
from time import gmtime, strftime
import sqlite3
from sqlclean import *

count = 0

def create_tables(stock):
    # one table per stock, named after the stock itself
    sql_command = """CREATE TABLE """ + stock + """(
    Stock_number INTEGER PRIMARY KEY,
    BuyPrice REAL,
    SellPrice REAL,
    Time VARCHAR(30));"""
    cursor.execute(sql_command)

def fill():
    # clean the scraped stock names so they are valid table names, then create the tables
    y = 0
    for i in stock:
        string = sqstring(i)
        stock[y] = string
        y = y + 1
    for i in stock:
        create_tables(str(i))

def populate():
    # insert the current sell/buy price of every stock into its table
    x = 0
    for i in stock:
        cursor.execute("""
        INSERT INTO """ + i + """
        (SellPrice,BuyPrice)
        VALUES
        (""" + sell[x] + """,""" + buy[x] + """)
        """)
        x = x + 1

def get_stocks(soup):
    global count
    rep1 = 0
    rep2 = 0
    if count == 0:
        # first page: build the stock/sell/buy lists, create the tables, insert prices
        count = count + 1
        for price in soup.find_all('span', {"class": "tbox-list-button-sell"}):
            sell.append(price.text)
        for price in soup.find_all('span', {"class": "tbox-list-button-buy"}):
            buy.append(price.text)
        for price in soup.find_all('div', {"class": "window_title list-title"}):
            a = price.text.strip()
            stock.append(a)
        fill()
        populate()
    else:
        # later pages: overwrite the price lists in place and insert the new rows
        for price in soup.find_all('span', {"class": "tbox-list-button-sell"}):
            sell[rep1] = price.text
            rep1 = rep1 + 1
        for price in soup.find_all('span', {"class": "tbox-list-button-buy"}):
            buy[rep2] = price.text
            rep2 = rep2 + 1
        populate()

connection = sqlite3.connect("stocks.db")
cursor = connection.cursor()
web = ["C:/Users/Luke_2/Desktop/Computing/Coursework/Practice/Stocks1/demo.trading212.com.html","C:/Users/Luke_2/Desktop/Computing/Coursework/live/Stocks1/demo.trading212.com1.html","C:/Users/Luke_2/Desktop/Computing/Coursework/live/Stocks1/demo.trading212.com10.24.html"]
stock = []
sell = []
buy = []

def run():
    for i in web:
        soup = BeautifulSoup(open(i), "html.parser")
        get_stocks(soup)

run()
connection.commit()
connection.close()
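For context on the error itself: open() with no encoding argument decodes the file with the platform default codec, which on a Western-locale Windows install is typically the 'charmap' (cp1252) codec, and cp1252 has no character assigned to byte 0x90. A minimal check, assuming Python 3 and reusing the newest path from the web list above, shows whether the newer saved page contains such a byte:

path = "C:/Users/Luke_2/Desktop/Computing/Coursework/live/Stocks1/demo.trading212.com10.24.html"
with open(path, "rb") as f:   # binary mode, so nothing is decoded
    raw = f.read()
# cp1252 leaves 0x90 (among a few other bytes) unmapped, so any page that
# contains it fails under the default text-mode open()
print(raw.count(b"\x90"), "occurrence(s) of byte 0x90 in the file")

The older saved pages presumably contain none of the bytes that cp1252 leaves unmapped, which is why they still load without an explicit encoding.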
Answer:
You are not telling the open() function which codec to use when it reads the file. For example, if the file is UTF-8 encoded:
In Python 2.7:
import io
...
def run():
    for i in web:
        with io.open(i, encoding='utf-8') as infile:
            soup = BeautifulSoup(infile, "html.parser")
For Python 3:
def run():
    for i in web:
        with open(i, encoding='utf-8') as infile:
            soup = BeautifulSoup(infile, "html.parser")
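If forcing encoding='utf-8' still raises a UnicodeDecodeError (that is, the saved pages are not actually UTF-8), a sketch of an alternative, assuming the encoding is unknown, is to open the file in binary mode and let BeautifulSoup work out the encoding from the document itself:

def run():
    for i in web:
        # 'rb' hands BeautifulSoup raw bytes; it then picks the encoding from
        # the document (declared meta charset or its own detection) instead of
        # relying on the platform default codec
        with open(i, "rb") as infile:
            soup = BeautifulSoup(infile, "html.parser")
        get_stocks(soup)

Binary mode behaves the same under Python 2.7 and Python 3, so this version does not need a separate io.open() variant.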