Compatible Tools
Compatibility Notes
Most tools built on AWS S3 let you configure the access endpoint. By pointing such a tool at the AWS S3-compatible service domain of BOS, you can use it to access BOS. The sections below take several common SDKs and tools as examples and show how to connect each of them to BOS.
Note:

| String | Meaning |
| --- | --- |
| $ACCESS_KEY | Access Key of your Baidu AI Cloud account |
| $SECRET_KEY | Secret Key of your Baidu AI Cloud account |
AWS SDK for Python
- Install the Boto3 library:

```shell
pip install boto3
```

- Use the AWS SDK for Python to access BOS:

```python
import boto3
from botocore.client import Config

s3 = boto3.client(
    's3',
    aws_access_key_id=$ACCESS_KEY,
    aws_secret_access_key=$SECRET_KEY,
    endpoint_url='http://s3.bj.bcebos.com',
    region_name='bj',
    config=Config(
        signature_version='s3v4',
    )
)

# Use the S3 client
s3.create_bucket(...)
```
AWS SDK for Java
- Add the dependency to pom.xml:

```xml
<!-- Add the AWS Java SDK dependency below to pom.xml -->
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk</artifactId>
    <version>1.11.82</version>
</dependency>
```

- Use the AWS SDK for Java to access BOS:

```java
import java.io.IOException;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3Client;
import com.amazonaws.services.s3.model.*;
import com.amazonaws.services.s3.S3ClientOptions;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.SDKGlobalConfiguration;

public class S3Sample {
    public static void main(String[] args) throws IOException {
        System.setProperty(SDKGlobalConfiguration.ENABLE_S3_SIGV4_SYSTEM_PROPERTY, "true");
        AmazonS3 s3 = new AmazonS3Client(new BasicAWSCredentials($ACCESS_KEY, $SECRET_KEY));
        s3.setEndpoint("s3.bj.bcebos.com");
        S3ClientOptions options = new S3ClientOptions();
        options.withChunkedEncodingDisabled(true);
        s3.setS3ClientOptions(options);

        // Use the S3 client
        s3.createBucket(...);
    }
}
```

- Build the code:

```shell
mvn package
```
AWS PHP SDK
- Install: download aws.phar. For other installation methods, see the AWS SDK for PHP installation guide.

- Use the AWS SDK for PHP to access BOS:

```php
<?php
require 'aws.phar';

use Aws\S3\S3Client;
use Aws\Exception\AwsException;

$s3Client = new S3Client([
    'version' => 'latest',
    'region' => 'bj',
    'credentials' => [
        'key' => $ACCESS_KEY,
        'secret' => $SECRET_KEY,
    ],
    'endpoint' => 'https://s3.bj.bcebos.com',
    'signature_version' => 'v4',
]);

$buckets = $s3Client->listBuckets();
foreach ($buckets['Buckets'] as $bucket) {
    echo $bucket['Name'] . "\n";
}
```
AWS Golang SDK
- Install:

```shell
go get -u github.com/aws/aws-sdk-go
```

- Use the AWS SDK for Go to access BOS:

```go
import (
    "fmt"

    "github.com/aws/aws-sdk-go/aws"
    "github.com/aws/aws-sdk-go/aws/credentials"
    "github.com/aws/aws-sdk-go/aws/session"
    "github.com/aws/aws-sdk-go/service/s3"
)

conf := &aws.Config{
    Region:      aws.String("bj"),
    Endpoint:    aws.String("s3.bj.bcebos.com"),
    Credentials: credentials.NewStaticCredentials($ACCESS_KEY, $SECRET_KEY, ""),
}
sess := session.Must(session.NewSessionWithOptions(session.Options{Config: *conf}))
svc := s3.New(sess)
getObjectParams := &s3.GetObjectInput{
    Bucket: aws.String("my-bucket"),
    Key:    aws.String("my-object"),
}
getObjectResp, err := svc.GetObject(getObjectParams)
if err != nil {
    fmt.Println(err.Error())
    return
}
```
AWS JavaScript SDK
1. Installation and Prerequisites

Requirements

Node.js environment:
- Node.js 14.x or later
- npm or yarn package manager

Browser environment:
- A modern browser (ES6+ support)
- ES Modules support, or a bundler such as Webpack or Vite
Install the dependency packages:

```shell
# Core S3 client
npm install @aws-sdk/client-s3

# Multipart upload helper
npm install @aws-sdk/lib-storage

# STS client (for temporary credentials)
npm install @aws-sdk/client-sts

# Credential providers (browser environment)
npm install @aws-sdk/credential-providers
```
Browser environment (CDN)

The ES modules can also be loaded from a CDN such as Skypack or jsDelivr:

```html
<script type="module">
  import {S3Client, CreateBucketCommand, PutObjectCommand, GetObjectCommand, DeleteObjectCommand} from 'https://cdn.skypack.dev/@aws-sdk/client-s3@3.826.0';
  import {Upload} from 'https://cdn.skypack.dev/@aws-sdk/lib-storage@3.826.0';
</script>
```
2. Initialization
2.1 Node.js server-side access (static AK/SK)

```javascript
import { S3Client } from '@aws-sdk/client-s3';

const s3Client = new S3Client({
    region: 'bj',
    endpoint: 'https://s3.bj.bcebos.com',
    credentials: {
        accessKeyId: '<your-access-key>',
        secretAccessKey: '<your-secret-access-key>'
    }
});
```
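A common variant of the initialization above reads the keys from environment variables rather than embedding them in source. A minimal sketch; the variable names `BOS_ACCESS_KEY` and `BOS_SECRET_KEY` are illustrative, not anything BOS itself requires:

```javascript
// Load AK/SK from the environment so they stay out of source control.
// BOS_ACCESS_KEY / BOS_SECRET_KEY are illustrative names.
function loadCredentials(env = process.env) {
  return {
    accessKeyId: env.BOS_ACCESS_KEY ?? '',
    secretAccessKey: env.BOS_SECRET_KEY ?? ''
  };
}

const credentials = loadCredentials();
if (!credentials.accessKeyId || !credentials.secretAccessKey) {
  console.warn('BOS credentials are not set in the environment');
}
```

The resulting object can be passed directly as the `credentials` option of the `S3Client` constructor.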
2.2 Browser access with STS temporary credentials

```javascript
import { S3Client } from '@aws-sdk/client-s3';

const s3Client = new S3Client({
    region: 'bj',
    endpoint: 'https://s3.bj.bcebos.com',
    credentials: {
        accessKeyId: '<your-access-key>',
        secretAccessKey: '<your-secret-access-key>',
        /** Temporary session token (optional), obtained from the Baidu Cloud STS service */
        sessionToken: '<your-session-token>'
    }
});
```
3. Common Operations
3.1 Create a bucket

```javascript
import { CreateBucketCommand } from '@aws-sdk/client-s3';

/**
 * Create a bucket
 *
 * @param {string} bucketName bucket name
 */
async function createBucket(bucketName) {
  try {
    const command = new CreateBucketCommand({
      /** Bucket name */
      Bucket: bucketName,
      /** Bucket configuration */
      CreateBucketConfiguration: {
        /** Bucket region */
        LocationConstraint: 'bj'
      }
    });

    const response = await s3Client.send(command);
    console.log('Bucket created:', response);
    return response;
  } catch (error) {
    console.error('Failed to create bucket:', error);
    throw error;
  }
}

// Usage
await createBucket('my-test-bucket-beijing-2024');
```
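A malformed name is a common reason for `createBucket` to fail. As a rough client-side pre-check, here is a sketch (this helper is not part of the SDK; the regex encodes the common S3-style constraints of 3-63 lowercase letters, digits, and hyphens, starting and ending with a letter or digit — the authoritative naming rules are in the BOS documentation):

```javascript
// Rough pre-check for bucket names before calling createBucket.
// Encodes common S3-style constraints only; see the BOS docs for the
// authoritative rules.
function isPlausibleBucketName(name) {
  // 3-63 chars, lowercase letters / digits / hyphens,
  // must start and end with a letter or digit.
  return /^[a-z0-9][a-z0-9-]{1,61}[a-z0-9]$/.test(name);
}

console.log(isPlausibleBucketName('my-test-bucket-beijing-2024')); // true
console.log(isPlausibleBucketName('My_Bucket'));                   // false
```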
3.2 Simple upload (PutObject)
Node.js environment

```javascript
import fs from 'fs';
import { PutObjectCommand } from '@aws-sdk/client-s3';

/**
 * Upload a file from Node.js
 *
 * @param {string} bucketName bucket name
 * @param {string} key object key
 * @param {string} filePath local file path
 */
async function uploadFile(bucketName, key, filePath) {
  try {
    const fileContent = fs.readFileSync(filePath);

    const command = new PutObjectCommand({
      Bucket: bucketName,
      Key: key,
      Body: fileContent,
      ContentType: 'application/octet-stream',
      Metadata: {
        'uploaded-by': 'aws-sdk-v3',
        'upload-time': new Date().toISOString()
      }
    });

    const response = await s3Client.send(command);
    console.log('File uploaded:', response);
    return response;
  } catch (error) {
    console.error('Upload failed:', error);
    throw error;
  }
}

// Usage
await uploadFile('my-test-bucket', 'documents/test.pdf', './local-file.pdf');
```
Browser environment

```javascript
import { PutObjectCommand } from '@aws-sdk/client-s3';

/**
 * Upload a file from the browser
 *
 * @param {string} bucketName bucket name
 * @param {string} key object key
 * @param {File} file File object
 */
async function uploadFileFromBrowser(bucketName, key, file) {
  try {
    const command = new PutObjectCommand({
      Bucket: bucketName,
      Key: key,
      Body: file,
      ContentType: file.type || 'application/octet-stream'
    });

    const response = await s3Client.send(command);
    console.log('File uploaded:', response);
    return response;
  } catch (error) {
    console.error('Upload failed:', error);
    throw error;
  }
}

// Usage
// Assumes the page contains: <input type="file" id="fileUpload">
const fileInput = document.getElementById('fileUpload');
const file = fileInput.files[0];
await uploadFileFromBrowser('my-test-bucket', 'documents/test.pdf', file);
```
3.3 Multipart upload
Node.js environment

```javascript
import fs from 'fs';
import { Upload } from '@aws-sdk/lib-storage';

/**
 * Multipart upload from Node.js
 *
 * @param {string} bucketName bucket name
 * @param {string} key object key
 * @param {string} filePath local file path
 */
async function multipartUpload(bucketName, key, filePath) {
  try {
    const fileStream = fs.createReadStream(filePath);
    const upload = new Upload({
      client: s3Client,
      params: {
        Bucket: bucketName,
        Key: key,
        Body: fileStream,
        ContentType: 'application/octet-stream'
      },
      /** Number of concurrent part uploads (optional) */
      queueSize: 4,
      /** Part size (optional), at least 5 MB */
      partSize: 5 * 1024 * 1024,
      /** Whether to keep already-uploaded parts on failure (optional) */
      leavePartsOnError: false
    });

    /** Track upload progress */
    upload.on('httpUploadProgress', (progress) => {
      console.log(`Upload progress: ${Math.round(progress.loaded / progress.total * 100)}%`);
    });

    const response = await upload.done();
    console.log('Multipart upload complete:', response);
    return response;
  } catch (error) {
    console.error('Multipart upload failed:', error);
    throw error;
  }
}

// Usage
await multipartUpload('my-test-bucket', 'large-files/video.mp4', './large-video.mp4');
```
Browser environment

```javascript
import { Upload } from '@aws-sdk/lib-storage';

/**
 * Multipart upload from the browser
 *
 * @param {string} bucketName bucket name
 * @param {string} key object key
 * @param {File} file File object
 */
async function multipartUploadBrowser(bucketName, key, file) {
  try {
    const upload = new Upload({
      client: s3Client,
      params: {
        Bucket: bucketName,
        Key: key,
        Body: file,
        ContentType: file.type || 'application/octet-stream'
      },
      /** Number of concurrent part uploads (optional) */
      queueSize: 4,
      /** Part size (optional), at least 5 MB */
      partSize: 5 * 1024 * 1024,
      /** Whether to keep already-uploaded parts on failure (optional) */
      leavePartsOnError: false
    });

    upload.on('httpUploadProgress', (progress) => {
      const percent = Math.round(progress.loaded / progress.total * 100);
      console.log(`Upload progress: ${percent}%`);
      // Update a progress bar in the UI here
    });

    const response = await upload.done();
    console.log('Multipart upload complete:', response);
    return response;
  } catch (error) {
    console.error('Multipart upload failed:', error);
    throw error;
  }
}

// Usage
// Assumes the page contains: <input type="file" id="fileUpload">
const fileInput = document.getElementById('fileUpload');
const file = fileInput.files[0];
await multipartUploadBrowser('my-test-bucket', 'large-files/video.mp4', file);
```
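For intuition about the `partSize` setting above, the number of parts an upload produces follows directly from the part size. A small illustrative helper (not part of `@aws-sdk/lib-storage`); note that S3-style multipart uploads also cap an upload at 10,000 parts, which bounds the object size reachable with a given part size:

```javascript
// How many parts a multipart upload will use for a given file size.
// Every part except the last must be at least 5 MB (the S3 minimum).
const MIN_PART_SIZE = 5 * 1024 * 1024;

function partCount(fileSize, partSize = MIN_PART_SIZE) {
  if (partSize < MIN_PART_SIZE) {
    throw new Error('partSize must be at least 5 MB');
  }
  return Math.max(1, Math.ceil(fileSize / partSize));
}

// A 23 MB file with 5 MB parts: 4 full parts + 1 partial part.
console.log(partCount(23 * 1024 * 1024)); // 5
```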
3.4 Download a file
Node.js environment

```javascript
import fs from 'node:fs';
import { pipeline } from 'node:stream/promises';
import { GetObjectCommand } from '@aws-sdk/client-s3';

/**
 * Download a file in Node.js
 *
 * @param {string} bucketName bucket name
 * @param {string} key object key
 * @param {string} downloadPath local download path
 */
async function downloadFile(bucketName, key, downloadPath) {
  try {
    const command = new GetObjectCommand({
      Bucket: bucketName,
      Key: key
    });

    const response = await s3Client.send(command);

    const writeStream = fs.createWriteStream(downloadPath);
    await pipeline(response.Body, writeStream);
    console.log('File downloaded to:', downloadPath);
    return downloadPath;
  } catch (error) {
    console.error('Download failed:', error);
    throw error;
  }
}

// Usage
await downloadFile('my-test-bucket', 'documents/test.pdf', './downloaded-file.pdf');
```
Browser environment

```javascript
import { GetObjectCommand } from '@aws-sdk/client-s3';

/**
 * Download a file in the browser
 *
 * @param {string} bucketName bucket name
 * @param {string} key object key
 */
async function downloadFileInBrowser(bucketName, key) {
  try {
    const command = new GetObjectCommand({
      Bucket: bucketName,
      Key: key
    });

    const response = await s3Client.send(command);

    const bytes = await response.Body.transformToByteArray();
    const file = new Blob([bytes], {
      type: response.ContentType || 'application/octet-stream'
    });

    // Trigger a browser download via a temporary object URL
    const url = URL.createObjectURL(file);
    const downloadLink = document.createElement('a');
    downloadLink.href = url;
    downloadLink.download = key.split('/').pop();
    downloadLink.click();
    URL.revokeObjectURL(url);

    console.log(`File downloaded: ${key}`);
    return bytes;
  } catch (error) {
    console.error('Download failed:', error);
    throw error;
  }
}

// Usage
await downloadFileInBrowser('my-test-bucket', 'documents/test.pdf');
```
3.5 Generate a download URL

```javascript
import { GetObjectCommand } from '@aws-sdk/client-s3';
import { getSignedUrl } from '@aws-sdk/s3-request-presigner';

/**
 * Generate a presigned download URL for an object
 *
 * @param {string} bucketName bucket name
 * @param {string} key object key
 * @param {number} expiresIn expiration time in seconds
 */
async function getDownloadUrl(bucketName, key, expiresIn = 3600) {
  try {
    const command = new GetObjectCommand({
      Bucket: bucketName,
      Key: key
    });

    const url = await getSignedUrl(s3Client, command, { expiresIn });
    console.log('Presigned download URL:', url);
    return url;
  } catch (error) {
    console.error('Failed to generate download URL:', error);
    throw error;
  }
}

// Usage
const downloadUrl = await getDownloadUrl('my-test-bucket', 'documents/test.pdf', 7200);
```
3.6 Delete files

```javascript
import { DeleteObjectCommand, DeleteObjectsCommand } from '@aws-sdk/client-s3';

/**
 * Delete a single object
 *
 * @param {string} bucketName bucket name
 * @param {string} key object key
 */
async function deleteFile(bucketName, key) {
  try {
    const command = new DeleteObjectCommand({
      Bucket: bucketName,
      Key: key
    });

    const response = await s3Client.send(command);
    console.log('Object deleted:', response);
    return response;
  } catch (error) {
    console.error('Delete failed:', error);
    throw error;
  }
}

/**
 * Delete multiple objects in one request
 *
 * @param {string} bucketName bucket name
 * @param {string[]} keys object keys
 */
async function deleteMultipleFiles(bucketName, keys) {
  try {
    const command = new DeleteObjectsCommand({
      Bucket: bucketName,
      Delete: {
        /** List of object keys */
        Objects: keys.map(key => ({ Key: key })),
        /** Quiet mode: when true, only errors are reported in the response */
        Quiet: false
      }
    });

    const response = await s3Client.send(command);
    console.log('Batch delete succeeded:', response);
    return response;
  } catch (error) {
    console.error('Batch delete failed:', error);
    throw error;
  }
}

// Usage
await deleteFile('my-test-bucket', 'documents/test.pdf');
await deleteMultipleFiles('my-test-bucket', ['file1.txt', 'file2.txt', 'folder/file3.jpg']);
```
4. Common Errors
4.1 Server-side errors

For common error codes and their descriptions, see: https://cloud.baidu.com/doc/BOS/s/Ajwvysfpl

```javascript
// Error-handling example
async function handleS3Errors() {
  try {
    // Your S3 operation here
  } catch (error) {
    console.error('S3 operation failed:', error);
    /** Error code */
    console.log(error.Code);
    /** Error message */
    console.log(error.message);
    /** HTTP status code */
    console.log(error.$metadata.httpStatusCode);
    /** Request ID */
    console.log(error.RequestId);
    console.log(error.$metadata.requestId);
    /** x-bce-debug-id */
    console.log(error.$metadata.extendedRequestId);
  }
}
```
4.2 Common issues and solutions

Issue 1: insufficient permissions
Error: AccessDenied: User is not authorized to perform: s3:PutObject
Solution: check that the IAM policy grants the required permissions.

Issue 2: CORS misconfiguration (browser environment)
Error: Access to XMLHttpRequest blocked by CORS policy
Solution: configure CORS rules on the bucket.
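For the CORS case, the fix is a rule set applied to the bucket itself. Below is a minimal illustrative rule in the common S3-style JSON shape; the origin value is a placeholder, and the exact field names in the BOS console or API may differ, so consult the BOS CORS documentation before applying it:

```json
[
  {
    "AllowedOrigins": ["https://your-site.example.com"],
    "AllowedMethods": ["GET", "PUT", "POST", "DELETE", "HEAD"],
    "AllowedHeaders": ["*"],
    "ExposeHeaders": ["ETag"],
    "MaxAgeSeconds": 3600
  }
]
```

Multipart uploads from the browser generally need `ETag` among the exposed headers, since the SDK reads each part's ETag from the response.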
4.3 Debugging tips

1. Enable verbose logging:

```javascript
import { S3Client } from '@aws-sdk/client-s3';

const s3Client = new S3Client({
    region: 'bj',
    endpoint: 'https://s3.bj.bcebos.com',
    credentials: {
        accessKeyId: '<your-access-key>',
        secretAccessKey: '<your-secret-access-key>'
    },
    /* Enable verbose logging */
    logger: console,
    /* Request settings */
    requestHandler: {
        connectionTimeout: 5000,
        socketTimeout: 10000
    }
});
```
2. Check network connectivity:

```shell
# Test connectivity to the endpoint
curl -I https://s3.bj.bcebos.com
```
            AWS CLI工具
- Install the AWS CLI:

```shell
pip install awscli
```

- Use the AWS CLI to access BOS.

- Edit the configuration:

```shell
$ aws configure

AWS Access Key ID [None]: <access_key_id>
AWS Secret Access Key [None]: <access_key_secret>
Default region name [None]: auto
Default output format [None]: json
```

- Example commands:

```shell
aws s3api list-buckets --endpoint-url https://s3.bj.bcebos.com
aws s3api list-objects --bucket bucketname --endpoint-url https://s3.bj.bcebos.com
```

- Reference: https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-welcome.html
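Because `aws configure` overwrites the default profile, you can keep the BOS credentials in a named profile instead; the profile name `bos` below is illustrative, while the credentials-file format and the `--profile` flag are standard AWS CLI features:

```ini
# ~/.aws/credentials
[bos]
aws_access_key_id = <access_key_id>
aws_secret_access_key = <access_key_secret>
```

Commands then select the profile explicitly, e.g. `aws s3api list-buckets --profile bos --endpoint-url https://s3.bj.bcebos.com`.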
Hadoop S3A工具
S3A is the official Hadoop toolkit for using S3 from within a Hadoop system. With S3A you can operate on S3 storage much like HDFS, and BOS currently supports most commonly used S3A features. For a more detailed introduction to S3A, see "S3 Support in Apache Hadoop" and "Hadoop-AWS module: Integration with Amazon Web Services".
- Download the dependency packages

- Make sure the following package is present in your Hadoop installation.

- Use the BOS-specific jar: because BOS has not yet implemented compatibility with some S3 features, please be sure to use this customized jar for the best experience.

| BOS-specific jar | MD5 |
| --- | --- |
| hadoop-aws-2.8.0.jar | 6ffbdc9352b9399e169005aeeb09ee98 |
- Update the S3A configuration:

```xml
<property>
    <name>fs.s3a.endpoint</name>
    <value>http://s3.bj.bcebos.com</value>
</property>
<property>
    <name>fs.s3a.signing-algorithm</name>
    <value>AWSS3V4SignerType</value>
</property>
<!-- Enable the BOS backend -->
<property>
    <name>fs.s3a.bos.compat.access</name>
    <value>true</value>
</property>
<property>
    <name>fs.s3a.access.key</name>
    <value>$ACCESS_KEY</value>
</property>
<property>
    <name>fs.s3a.secret.key</name>
    <value>$SECRET_KEY</value>
</property>
```

- Try out BOS from your Hadoop environment through S3A:

```shell
hadoop fs -ls s3a://$YOUR_BUCKET_NAME
```
How to patch hadoop-aws yourself

(1) Download the Hadoop source code.
(2) Edit hadoop-2.8.0-src/hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/S3ClientFactory.java. BOS does not yet support chunked upload, which the Java SDK uses by default for write operations, so this must be disabled in the SDK:

```java
// Locate the AWS client that s3a uses, e.g. in createAmazonS3Client:
// AmazonS3 s3 = new AmazonS3Client(credentials, awsConf);
// Disable chunked encoding on that client:
s3.setS3ClientOptions(new S3ClientOptions().withChunkedEncodingDisabled(true));
```

(3) Build the jar:

```shell
cd hadoop-2.8.0-src/hadoop-tools/hadoop-aws
mvn package
```
CloudBerry Explorer for Amazon S3
CloudBerry Explorer for Amazon S3 is a graphical S3 data-management tool from CloudBerry Lab. Its GUI supports uploading, downloading, syncing, and deleting data between local storage and the cloud.
Baidu AI Cloud now supports managing BOS resources with CloudBerry Explorer.

- Download and install the version that matches your operating system.
  Note: the software comes in a free and a paid edition; the free edition is sufficient.

- After installation, configure the cloud storage and choose "S3 Compatible".

- Add a new S3 Compatible account. The display name can be "BOS"; the service point is the BOS service domain; fill in the AK/SK of your Baidu AI Cloud account as the access key and secret. The AWS S3-compatible service domain does not support HTTPS yet, so uncheck "Use SSL" and set the signature version to 4.

- Click "Test Connection"; "Connection success" means the connection works.

- You can now use CloudBerry Explorer for all kinds of data management.