Asber777/RobotLearning
libcnn is the CNN ROS library released with the paper.
GitHub link: https://github.com/libcnn/libcnn
RAM-LAB link: https://www.ram-lab.com/reconstruction-3d/

The library has a light dependency list:
1. Eigen version 3, a C++ template library for linear algebra: matrices, vectors, numerical solvers, and related algorithms.
2. boost version 1.49, peer-reviewed portable C++ source libraries.

The library was compiled on Ubuntu.

---

lmdb_source_data is the dataset: depth images and their corresponding labels. rgbd-human-explore is the bag file provided by RAM-LAB.

Network architecture and image preprocessing from the paper: the original depth map from the Kinect is 640x480; it is downsampled to 1/4 of the original resolution in each dimension, i.e. 160x120, to reduce computational cost. The downsampled depth map is then put through three stages of "convolution + activation + pooling":

**The first** convolution layer uses 32 convolution kernels of size 5x5, followed by a ReLU layer and a 2x2 pooling layer with stride 2. **The second stage** of convolution + activation + pooling is the same as the first stage. **For the third stage**, 64 convolution kernels of size 5x5 are used, with the ReLU layer and pooling layer unchanged. This results in 64 feature maps of size 20x15.

The fully connected layer is made up of five nodes; the last layer represents the scoring of each output state. The control commands consist of five states: one for going straight ahead, two for turning left, and two for turning right, as previously mentioned. The final decision is calculated by applying the soft-max function to the scores of the five possible states.
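As a sanity check on the sizes above, the per-stage shape arithmetic can be sketched in plain Python. The paper does not state the convolution padding, but the 20x15 output only works out if each 5x5 convolution preserves spatial size (stride 1, padding 2), so that is assumed here:

```python
def stage(h, w, k=5, pad=2, pool=2):
    # Convolution with stride 1: out = in + 2*pad - k + 1.
    # With k=5 and pad=2 the spatial size is unchanged.
    h = h + 2 * pad - k + 1
    w = w + 2 * pad - k + 1
    # 2x2 pooling with stride 2 halves each dimension.
    return h // pool, w // pool

h, w = 120, 160  # downsampled Kinect depth map (width 160, height 120)
for _ in range(3):  # three "convolution + activation + pooling" stages
    h, w = stage(h, w)
print(h, w)  # -> 15 20, i.e. the 20x15 feature maps from the paper
```

Each stage halves the resolution, so three stages take 160x120 down by a factor of 8 to 20x15.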
In short, it is a 3-stage network: the first stage is 32 convolution kernels of size 5x5, followed by a ReLU activation layer (better activations arguably exist nowadays) and a 2x2 pooling layer with stride 2; the second stage is the same as the first; the third stage is 64 convolution kernels of size 5x5 with the same ReLU and pooling, so the final 64 feature maps are of size 20x15. These are fully connected to 5 nodes, followed by a soft-max.

---

Experiment log:

The first experiment is "AvoidanceLearning(1) — forgot to normalize". Training ran on the CPU, so it took quite a while. The data were split 20/80 into test and training sets, with random sampling.

print len(train_data), len(test_data)
880 220
error = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(cnn.parameters(), lr=0.0001)

Per-evaluation test accuracy (abridged; the raw log printed one ('accuracy:', ...) tuple per evaluation): it started at 0.182, jumped to about 0.35 within a few evaluations, passed 0.40 around the 35th, reached about 0.50 around the 95th, then plateaued mostly in the 0.55–0.57 range for the rest of training, with a best value of 0.591 near the end.
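The 20/80 random split above (220 test / 880 training samples out of 1100) can be sketched as follows. This is a minimal stand-in, not the repository's actual loader; the names `train_data`/`test_data` come from the log, everything else is assumed:

```python
import random

def split_dataset(data, test_frac=0.2, seed=0):
    """Randomly split a dataset into training and test subsets."""
    rng = random.Random(seed)
    idx = list(range(len(data)))
    rng.shuffle(idx)  # random sampling, as noted in the experiment log
    n_test = int(len(data) * test_frac)
    test = [data[i] for i in idx[:n_test]]
    train = [data[i] for i in idx[n_test:]]
    return train, test

train_data, test_data = split_dataset(list(range(1100)))
print(len(train_data), len(test_data))  # -> 880 220, matching the log
```

Fixing the seed makes the split reproducible across runs, which matters when comparing accuracy curves between experiments.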
About
Reproduction of the paper "Autonomous exploration of mobile robots through deep neural networks".