
Building a Scrapy crawler image from a Dockerfile based on Alpine

1. Pull the alpine image

[root@DockerBrian ~]# docker pull alpine

Using default tag: latest

Trying to pull repository docker.io/library/alpine ...

latest: Pulling from docker.io/library/alpine

4fe2ade4980c: Pull complete

Digest: sha256:621c2f39f8133acb8e64023a94dbdf0d5ca81896102b9e57c0dc184cadaf5528

Status: Downloaded newer image for docker.io/alpine:latest

[root@docker43 ~]# docker images

REPOSITORY          TAG       IMAGE ID       CREATED       SIZE
docker.io/alpine    latest    196d12cf6ab1   3 weeks ago   4.41 MB
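As an aside, if you ever need to consume `docker images` output from a script, the aligned table can be parsed by splitting on runs of two or more spaces. A minimal Python sketch (`parse_docker_images` is an illustrative helper, not a Docker API; the sample text is the output shown above):

```python
import re

def parse_docker_images(output: str):
    """Parse the aligned table printed by `docker images` into dicts.

    Assumes columns are separated by runs of 2+ spaces, which is how
    docker pads its table output (single spaces, as in "IMAGE ID" or
    "3 weeks ago", stay inside one field).
    """
    lines = [l for l in output.strip().splitlines() if l.strip()]
    headers = re.split(r"\s{2,}", lines[0].strip())
    return [dict(zip(headers, re.split(r"\s{2,}", line.strip())))
            for line in lines[1:]]

sample = """\
REPOSITORY          TAG       IMAGE ID       CREATED       SIZE
docker.io/alpine    latest    196d12cf6ab1   3 weeks ago   4.41 MB
"""
images = parse_docker_images(sample)
```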

2. Write the Dockerfile

Create a scrapy directory to hold the Dockerfile:

[root@DockerBrian ~]# mkdir /opt/alpineDockerfile/

[root@DockerBrian ~]# cd /opt/alpineDockerfile/

[root@DockerBrian alpineDockerfile]# mkdir scrapy && cd scrapy && touch Dockerfile

[root@DockerBrian scrapy]# ll
total 4
-rw-r--r-- 1 root root 1394 Oct 10 11:36 Dockerfile

Edit the Dockerfile:

# Base image to build from
FROM alpine

# Maintainer info
MAINTAINER alpine_python3_scrapy (zhujingzhi@123.com)

# Switch the apk repositories to the Aliyun mirror
RUN echo "http://mirrors.aliyun.com/alpine/latest-stable/main/" > /etc/apk/repositories && \
    echo "http://mirrors.aliyun.com/alpine/latest-stable/community/" >> /etc/apk/repositories

# Update the package index, install openssh and tzdata, sync the timezone,
# allow root login, generate the host keys, and set the root password
RUN apk update && \
    apk add --no-cache openssh-server tzdata && \
    cp /usr/share/zoneinfo/Asia/Shanghai /etc/localtime && \
    sed -i "s/#PermitRootLogin.*/PermitRootLogin yes/g" /etc/ssh/sshd_config && \
    ssh-keygen -t rsa -P "" -f /etc/ssh/ssh_host_rsa_key && \
    ssh-keygen -t ecdsa -P "" -f /etc/ssh/ssh_host_ecdsa_key && \
    ssh-keygen -t ed25519 -P "" -f /etc/ssh/ssh_host_ed25519_key && \
    echo "root:h056zHJLg85oW5xh7VtSa" | chpasswd

# Install the packages Scrapy depends on (all of these are required)
RUN apk add --no-cache python3 python3-dev gcc openssl-dev openssl libressl libc-dev linux-headers libffi-dev libxml2-dev libxml2 libxslt-dev openssh-client openssh-sftp-server

# Install the pip packages the environment needs (add or remove as required)
RUN pip3 install --default-timeout=100 --no-cache-dir --upgrade pip setuptools pymysql pymongo redis scrapy-redis ipython Scrapy requests

# Startup script for sshd
RUN echo "/usr/sbin/sshd -D" >> /etc/start.sh && \
    chmod +x /etc/start.sh

# Expose port 22
EXPOSE 22

# Run the ssh startup script
CMD ["/bin/sh","/etc/start.sh"]

This produces a container that can be reached remotely over SSH, with Scrapy installed on a Python 3 environment; the start.sh script launches the SSH service.
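The mirror-swap step is nothing more than writing two repository URLs into /etc/apk/repositories. As an illustration of what those two `echo` commands produce, here is a hypothetical Python helper (not part of the image) that builds the same file content for an arbitrary mirror:

```python
def apk_repositories(mirror: str, branch: str = "latest-stable") -> str:
    """Build the contents of /etc/apk/repositories for a given mirror,
    mirroring what the two echo commands in the Dockerfile write."""
    return "".join(f"{mirror}/alpine/{branch}/{repo}/\n"
                   for repo in ("main", "community"))

content = apk_repositories("http://mirrors.aliyun.com")
```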

3. Build the image

Build the image:

[root@DockerBrian scrapy]# docker build -t scrapy_redis_ssh:v1 .

List the images:

[root@DockerBrian scrapy]# docker images

REPOSITORY          TAG       IMAGE ID       CREATED       SIZE
scrapy_redis_ssh    v1        b2c95ef95fb9   4 hours ago   282 MB
docker.io/alpine    latest    196d12cf6ab1   4 weeks ago   4.41 MB

4. Create a container

Create the container (named scrapy10086, with the container's SSH port mapped to host port 10086):


docker run -itd --restart=always --name scrapy10086 -p 10086:22 scrapy_redis_ssh:v1
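If you create containers from a script rather than by hand, the same invocation can be composed as an argument list (suitable for `subprocess.run`). The helper below is a hypothetical sketch of the command shown above, not part of the original setup:

```python
def docker_run_cmd(image: str, name: str, host_port: int,
                   container_port: int = 22) -> list:
    """Compose the `docker run` call used above: detached + tty,
    auto-restart, with the container's SSH port published on the host."""
    return [
        "docker", "run", "-itd",
        "--restart=always",
        "--name", name,
        "-p", f"{host_port}:{container_port}",
        image,
    ]

cmd = docker_run_cmd("scrapy_redis_ssh:v1", "scrapy10086", 10086)
# e.g. subprocess.run(cmd, check=True) would start the container
```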

List the containers:

[root@DockerBrian scrapy]# docker ps

CONTAINER ID    IMAGE           COMMAND                  CREATED       STATUS       PORTS                   NAMES
7fb9e69d79f5    b2c95ef95fb9    "/bin/sh /etc/star..."   3 hours ago   Up 3 hours   0.0.0.0:10086->22/tcp   scrapy10086

Log in to the container:

[root@DockerBrian scrapy]# ssh root@127.0.0.1 -p 10086

The authenticity of host '[127.0.0.1]:10086 ([127.0.0.1]:10086)' can't be established.

ECDSA key fingerprint is SHA256:wC46AU6SLjHyEfQWX6d6ht9MdpGKodeMOK6/cONcpxk.

ECDSA key fingerprint is MD5:6a:b7:31:3c:63:02:ca:74:5b:d9:68:42:08:be:22:fc.

Are you sure you want to continue connecting (yes/no)? yes

Warning: Permanently added '[127.0.0.1]:10086' (ECDSA) to the list of known hosts.

root@127.0.0.1's password: # the password is the one set in the Dockerfile: echo "root:h056zHJLg85oW5xh7VtSa" | chpasswd

Welcome to Alpine!

The Alpine Wiki contains a large amount of how-to guides and general

information about administrating Alpine systems.

See <http://wiki.alpinelinux.org>.

You can setup the system with the command: setup-alpine

You may change this message by editing /etc/motd.

7363738cc96a:~#
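Before attempting the SSH login, it can be handy to confirm that the mapped port actually accepts TCP connections. A small Python sketch (`ssh_port_open` is an illustrative name; it only tests TCP reachability, not SSH authentication):

```python
import socket

def ssh_port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds -- enough to
    tell that sshd is listening, though not that a login would work."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# ssh_port_open("127.0.0.1", 10086) should be True once the container runs
```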

5. Test

Create a Scrapy project as a test:

7363738cc96a:~# scrapy startproject test

New Scrapy project 'test', using template directory '/usr/lib/python3.6/site-packages/scrapy/templates/project', created in:

/root/test

You can start your first spider with:

cd test

scrapy genspider example example.com

7363738cc96a:~# cd test/

7363738cc96a:~/test# ls

scrapy.cfg test

7363738cc96a:~/test# cd test/

7363738cc96a:~/test/test# ls

__init__.py __pycache__ items.py middlewares.py pipelines.py settings.py spiders

7363738cc96a:~/test/test#

The test succeeds.
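If you want to verify a freshly generated project from a script, you can check the package directory against the file list shown in the `ls` output above. A hypothetical Python helper (not part of Scrapy itself):

```python
from pathlib import Path

# Entries scrapy's project template generates inside the package directory,
# as seen in the ls output above (__pycache__ is created later by the
# interpreter, so it is not part of the template).
EXPECTED = {"__init__.py", "items.py", "middlewares.py",
            "pipelines.py", "settings.py", "spiders"}

def missing_project_files(package_dir: str) -> set:
    """Return which expected template entries are absent from package_dir."""
    present = {p.name for p in Path(package_dir).iterdir()}
    return EXPECTED - present
```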

That is all for this article; I hope it helps with your learning.

Original article: https://www.cnblogs.com/zhujingzhi/p/9766965.html
