
Notes on Configuring a Spark and Python Development Environment on CentOS 6

Centos  2023-02-09 23:08

1. Launching pyspark from $SPARK_HOME/bin/ fails with the following error:

File "/home/joy/spark/spark/python/pyspark/shell.py", line 28, in

import py4j zipimport.ZipImportError: can't decompress data; zlib not available

Following the search results, I first installed the missing packages with yum install -y zlib*, but the error persisted; running ./pyspark under sudo, however, worked. For now it only runs under sudo, which is probably an environment issue still to be resolved — because the installation was done with sudo, the files are owned by root, so use chown to change the owner.
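A sketch of that ownership fix — the path comes from the traceback above and the user joy from the later hadoop fs -ls output; adjust both to your setup:

sudo chown -R joy:joy /home/joy/spark/spark   # hand the sudo-installed files back to the login user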
But that still forces pip to be installed with sudo, so to solve the problem once and for all I recompiled Python.
http://blog.csdn.net/woszsj/article/details/16848871
Solution:

1. Install the dependencies zlib and zlib-devel.

2. Recompile and reinstall Python:

./configure
Edit the Modules/Setup file: find the zlib entry (shown in the sketch after these steps) and uncomment it.

Recompile and install: make && make install
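A minimal end-to-end sketch of the rebuild, assuming a Python 2.7 source tree and the default install prefix (the version and paths are assumptions):

sudo yum install -y zlib zlib-devel          # step 1: the compression library and headers
cd Python-2.7.x                              # your extracted Python source directory
./configure
# In Modules/Setup, the zlib entry is commented out by default; in a stock tree it reads:
#     #zlib zlibmodule.c -I$(prefix)/include -L$(exec_prefix)/lib -lz
# Remove the leading '#' so the zlib module gets built.
make && make install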
The build finished, but some modules still failed to build:
Python build finished, but the necessary bits to build these modules were not found:
_bsddb _curses _curses_panel
_sqlite3 _ssl _tkinter
bsddb185 bz2 dbm
dl gdbm imageop
Reference: http://blog.csdn.net/huanle0610/article/details/41174943

Whatever the exact wording of the error, the meaning is clear: at build time the system cannot find the corresponding module dependencies. To resolve the errors, the dependency packages must be installed beforehand. The mapping is roughly as follows (possibly incomplete):

Module         Dependency            Notes
_bsddb         bsddb                 Interface to the Berkeley DB library. bsddb is deprecated since Python 2.6; the bsddb3 module is preferred.
_curses        ncurses               Terminal handling for character-cell displays.
_curses_panel  ncurses               A panel stack extension for curses.
_sqlite3       sqlite                DB-API 2.0 interface for SQLite databases. On CentOS, install sqlite-devel.
_ssl           openssl-devel.i686    TLS/SSL wrapper for socket objects.
_tkinter       N/A                   A thin object-oriented layer on top of Tcl/Tk. Can be ignored if you do not use desktop programs.
bsddb185       old bsddb module      The old bsddb module; can be ignored.
bz2            bzip2-devel.i686      Compression compatible with bzip2.
dbm            bsddb                 Simple "database" interface.
dl             N/A                   Call C functions in shared objects. Deprecated since Python 2.6.
gdbm           gdbm-devel.i686       GNU's reinterpretation of dbm.
imageop        N/A                   Manipulate raw image data. Deprecated.
readline       readline-devel        GNU readline interface.
sunaudiodev    N/A                   Access to Sun audio hardware. Sun-specific; can be ignored on CentOS.
zlib           zlib-devel            Compression compatible with gzip.

On CentOS, install these dependency packages: readline-devel, sqlite-devel, bzip2-devel.i686, openssl-devel.i686, gdbm-devel.i686, libdbi-devel.i686, ncurses-libs, zlib-devel.i686 (a sketch follows). After installing them, compile again; errors for the modules marked deprecated or ignorable in the table above can be disregarded.
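On this machine the installs would look roughly like this; the .i686 suffixes match the author's 32-bit packages and may not apply to a 64-bit system:

sudo yum install -y readline-devel sqlite-devel bzip2-devel.i686 openssl-devel.i686 \
    gdbm-devel.i686 libdbi-devel.i686 ncurses-libs zlib-devel.i686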

After the build completes, run the install step above to put Python into the target directory. Once installation finishes, check the installation directory to confirm Python was installed correctly.
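A quick check, assuming Python was installed under a hypothetical /usr/local/python27 prefix (adjust to whatever --prefix you used):

/usr/local/python27/bin/python -V    # should print the freshly built version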

2. Preparing Spark SQL

Reference: http://www.2cto.com/database/201504/392307.html

First, what does using HiveContext require? This article covers it: http://www.cnblogs.com/byrhuangqiang/p/4012087.html
It lists three requirements:
1. Check that $SPARK_HOME/lib contains datanucleus-api-jdo-3.2.1.jar, datanucleus-rdbms-3.2.1.jar, and datanucleus-core-3.2.2.jar.
2. Check that $SPARK_HOME/conf contains a hive-site.xml copied over from $HIVE_HOME/conf.
3. When submitting a program, put the database driver jar on the driver classpath, e.g. bin/spark-submit --driver-class-path *.jar, or set SPARK_CLASSPATH in spark-env.sh; see the sketch after this list.
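A hedged example of requirement 3 — the jar path and application name are placeholders, not values from this setup:

bin/spark-submit --driver-class-path /path/to/mysql-connector-java.jar your_app.py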

Following the article, copy the datanucleus-* jars from $HIVE_HOME/lib into $SPARK_HOME/lib, copy hive-site.xml from $HIVE_HOME/conf into $SPARK_HOME/conf, and copy the mysql-connector jar from $HIVE_HOME/lib into $SPARK_HOME/jars. The commands are sketched below.
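As shell commands (a sketch, assuming $HIVE_HOME and $SPARK_HOME are set and the jar names match your versions):

cp $HIVE_HOME/lib/datanucleus-*.jar $SPARK_HOME/lib/
cp $HIVE_HOME/conf/hive-site.xml $SPARK_HOME/conf/
cp $HIVE_HOME/lib/mysql-connector-java-*.jar $SPARK_HOME/jars/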

3. spark-shell fails at startup

To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/01/17 11:42:58 WARN SparkContext: Support for Java 7 is deprecated as of Spark 2.0.0
17/01/17 11:43:00 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/01/17 11:43:00 WARN Utils: Your hostname, node1 resolves to a loopback address: 127.0.0.1; using 192.168.85.128 instead (on interface eth1)
17/01/17 11:43:00 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
17/01/17 11:43:11 WARN HiveConf: DEPRECATED: hive.metastore.ds.retry.* no longer has any effect.  Use hive.hmshandler.retry.* instead
17/01/17 11:43:11 WARN HiveConf: HiveConf of name hive.server2.thrift.http.min.worker.threads does not exist
17/01/17 11:43:11 WARN HiveConf: HiveConf of name hive.mapjoin.optimized.keys does not exist
17/01/17 11:43:11 WARN HiveConf: HiveConf of name hive.mapjoin.lazy.hashtable does not exist
17/01/17 11:43:11 WARN HiveConf: HiveConf of name hive.datampi.maxslots does not exist
17/01/17 11:43:11 WARN HiveConf: HiveConf of name hive.metastore.ds.retry.attempts does not exist
17/01/17 11:43:11 WARN HiveConf: HiveConf of name hive.server2.thrift.http.max.worker.threads does not exist
17/01/17 11:43:11 WARN HiveConf: HiveConf of name hive.datampi.sendqueue does not exist
17/01/17 11:43:11 WARN HiveConf: HiveConf of name hive.optimize.multigroupby.common.distincts does not exist
17/01/17 11:43:11 WARN HiveConf: HiveConf of name hive.metastore.ds.retry.interval does not exist
17/01/17 11:43:11 WARN HiveConf: HiveConf of name hive.datampi.parallelism does not exist
17/01/17 11:43:11 WARN HiveConf: HiveConf of name hive.stats.map.parallelism does not exist
17/01/17 11:43:11 WARN HiveConf: HiveConf of name hive.datampi.memusedpercent does not exist
17/01/17 11:43:12 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/home/joy/spark/spark-2.1.0-bin-hadoop2.6/jars/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/home/joy/spark/spark/jars/datanucleus-rdbms-3.2.9.jar."
17/01/17 11:43:12 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/home/joy/spark/spark-2.1.0-bin-hadoop2.6/jars/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/home/joy/spark/spark/jars/datanucleus-api-jdo-3.2.6.jar."
17/01/17 11:43:12 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/home/joy/spark/spark/jars/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/home/joy/spark/spark-2.1.0-bin-hadoop2.6/jars/datanucleus-core-3.2.10.jar."
17/01/17 11:43:16 WARN HiveConf: DEPRECATED: hive.metastore.ds.retry.* no longer has any effect.  Use hive.hmshandler.retry.* instead
17/01/17 11:43:16 WARN HiveConf: HiveConf of name hive.server2.thrift.http.min.worker.threads does not exist
17/01/17 11:43:16 WARN HiveConf: HiveConf of name hive.mapjoin.optimized.keys does not exist
17/01/17 11:43:16 WARN HiveConf: HiveConf of name hive.mapjoin.lazy.hashtable does not exist
17/01/17 11:43:16 WARN HiveConf: HiveConf of name hive.datampi.maxslots does not exist
17/01/17 11:43:16 WARN HiveConf: HiveConf of name hive.metastore.ds.retry.attempts does not exist
17/01/17 11:43:16 WARN HiveConf: HiveConf of name hive.server2.thrift.http.max.worker.threads does not exist
17/01/17 11:43:16 WARN HiveConf: HiveConf of name hive.datampi.sendqueue does not exist
17/01/17 11:43:16 WARN HiveConf: HiveConf of name hive.optimize.multigroupby.common.distincts does not exist
17/01/17 11:43:16 WARN HiveConf: HiveConf of name hive.metastore.ds.retry.interval does not exist
17/01/17 11:43:16 WARN HiveConf: HiveConf of name hive.datampi.parallelism does not exist
17/01/17 11:43:16 WARN HiveConf: HiveConf of name hive.stats.map.parallelism does not exist
17/01/17 11:43:16 WARN HiveConf: HiveConf of name hive.datampi.memusedpercent does not exist
17/01/17 11:43:22 ERROR ObjectStore: Version information found in metastore differs 0.13.0 from expected schema version 1.2.0. Schema verififcation is disabled hive.metastore.schema.verification so setting version.
java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
  at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:981)
  at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
  at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
  at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
  at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
  at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
  at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
  at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
  at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
  at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
  at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
  at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
  ... 47 elided
Caused by: java.lang.reflect.InvocationTargetException: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':
  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
  at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
  ... 58 more
Caused by: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':
  at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:169)
  at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
  at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
  at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
  at scala.Option.getOrElse(Option.scala:121)
  at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
  at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
  at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
  at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
  ... 63 more
Caused by: java.lang.reflect.InvocationTargetException: java.lang.reflect.InvocationTargetException: java.lang.RuntimeException: java.io.FileNotFoundException: File /hive/tmp does not exist
  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
  at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
  ... 71 more
Caused by: java.lang.reflect.InvocationTargetException: java.lang.RuntimeException: java.io.FileNotFoundException: File /hive/tmp does not exist
  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
  at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
  at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
  at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
  at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
  ... 76 more
Caused by: java.lang.RuntimeException: java.io.FileNotFoundException: File /hive/tmp does not exist
  at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
  at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
  ... 84 more
Caused by: java.io.FileNotFoundException: File /hive/tmp does not exist
  at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:537)
  at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:750)
  at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:527)
  at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:409)
  at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:599)
  at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
  at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
  ... 85 more

Analyzing the error: the cause is a FileNotFoundException because /hive/tmp does not exist. Searching hive-site.xml, this path turns out to be the value of hive.exec.scratchdir, an HDFS path used to store the execution plans of the different map/reduce stages and the intermediate output of those stages.
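To confirm which value is configured, you can pull the property straight out of the copy under $SPARK_HOME/conf (a sketch; assumes typical hive-site.xml formatting with <name> and <value> on adjacent lines):

grep -A 1 'hive.exec.scratchdir' $SPARK_HOME/conf/hive-site.xml   # the <value> line should show /hive/tmp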

Running hadoop fs -ls /hive in a terminal gives:

Found 2 items
drwxr-xr-x   - joy supergroup          0 2016-06-12 21:35 /hive/log
drwxr-xr-x   - joy supergroup          0 2017-01-16 14:17 /hive/tmp

The permissions are wrong; group write should be added: hadoop fs -chmod g+w /hive/tmp and hadoop fs -chmod g+w /hive/log. Yet the "does not exist" error persisted.
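The stack trace above goes through RawLocalFileSystem, i.e. Spark was resolving /hive/tmp against the local filesystem rather than HDFS, which explains why the directory "did not exist" even though it is present in HDFS. Pointing Spark at the Hadoop configuration fixes the resolution; a minimal addition to spark-env.sh (the path is an assumption — use the directory that holds your core-site.xml and hdfs-site.xml):

export HADOOP_CONF_DIR=/home/joy/hadoop/etc/hadoop   # hypothetical path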

After adding HADOOP_CONF_DIR to spark-env.sh under $SPARK_HOME/conf as above, the error changed to:

java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
  at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:981)
  at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
  at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
  at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
  at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
  at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
  at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
  at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
  at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
  at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
  at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
  at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
  ... 47 elided
Caused by: java.lang.reflect.InvocationTargetException: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':
  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
  at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
  ... 58 more
Caused by: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':
  at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:169)
  at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
  at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
  at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
  at scala.Option.getOrElse(Option.scala:121)
  at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
  at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
  at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
  at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
  ... 63 more
Caused by: java.lang.reflect.InvocationTargetException: java.lang.reflect.InvocationTargetException: java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /hive/tmp on HDFS should be writable. Current permissions are: rwxrwxr-x
  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
  at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
  ... 71 more
Caused by: java.lang.reflect.InvocationTargetException: java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /hive/tmp on HDFS should be writable. Current permissions are: rwxrwxr-x
  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
  at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
  at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
  at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
  at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
  ... 76 more
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /hive/tmp on HDFS should be writable. Current permissions are: rwxrwxr-x
  at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
  at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
  ... 84 more
Caused by: java.lang.RuntimeException: The root scratch dir: /hive/tmp on HDFS should be writable. Current permissions are: rwxrwxr-x
  at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:612)
  at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
  at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
  ... 85 more
<console>:14: error: not found: value spark
       import spark.implicits._
              ^
<console>:14: error: not found: value spark
       import spark.sql

The error now complains that the directory permissions are wrong. Running hadoop fs -ls /hive again:

drwxrwxr-x   - joy supergroup          0 2016-06-12 21:35 /hive/log
drwxrwxr-x   - joy supergroup          0 2017-01-16 14:17 /hive/tmp

Changing the directory permissions to 777 (commands below) finally let spark-shell start successfully.
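A sketch of the change described above:

hadoop fs -chmod 777 /hive/tmp
hadoop fs -chmod 777 /hive/log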

4. KeyError: u'y'

The error looks like the following:

Traceback (most recent call last):
  File "/Users/lyj/Programs/kiseliugit/MyPysparkCodes/test/spark2.0.py", line 5, in <module>
    spark = SparkSession.builder.master("local").appName('test 2.0').config(conf=SparkConf()).getOrCreate()
  File "/Users/lyj/Programs/Apache/Spark2/python/pyspark/conf.py", line 104, in __init__
    SparkContext._ensure_initialized()
  File "/Users/lyj/Programs/Apache/Spark2/python/pyspark/context.py", line 243, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway()
  File "/Users/lyj/Programs/Apache/Spark2/python/pyspark/java_gateway.py", line 116, in launch_gateway
    java_import(gateway.jvm, "org.apache.spark.SparkConf")
  File "/Library/Python/2.7/site-packages/py4j/java_gateway.py", line 90, in java_import
    return_value = get_return_value(answer, gateway_client, None, None)
  File "/Library/Python/2.7/site-packages/py4j/protocol.py", line 306, in get_return_value
    value = OUTPUT_CONVERTER[type](answer[2:], gateway_client)
KeyError: u'y'

The cause is an outdated py4j; upgrading it with pip fixes the error, as sketched below.
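A minimal fix, assuming pip manages the py4j that pyspark imports:

pip install --upgrade py4j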
Reference: http://***.com/questions/38637988/how-could-i-write-the-right-entry-point-in-spark-2-0-program-actually-pyspark-2
