
Building a SqueezeNet Network with Caffe

I previously built SqueezeNet in PyTorch, which I personally find the most convenient framework, but some projects require a Caffe model, so in this post I build the same SqueezeNet network with Caffe.

Data preparation

First the data has to be prepared. This differs from PyTorch: a PyTorch dataset loader only needs the root directory of the dataset and reads the images from there directly, whereas Caffe reads training data from a txt file that lists each image's absolute path together with its class label. The following script generates that txt file:

```python
import os
import random

folder = 'cotta'  # dataset directory (relative path)
names = os.listdir(folder)

f1 = open('/train_txt/train_cotta.txt', 'a')            # output txt path
f2 = open('/train_txt/test_water_workcloth.txt', 'a')

for name in names:
    imgnames = os.listdir(os.path.join(folder, name))
    random.shuffle(imgnames)
    numimg = len(imgnames)
    for i in range(numimg):
        f1.write('%s %s\n' % (os.path.join(folder, name, imgnames[i]), name[0]))
        # To instead hold out 10% of each class as a test set:
        # if i < int(0.9 * numimg):
        #     f1.write('%s %s\n' % (os.path.join(folder, name, imgnames[i]), name[0]))
        # else:
        #     f2.write('%s %s\n' % (os.path.join(folder, name, imgnames[i]), name[0]))

# f2.close()
f1.close()
```

The dataset layout is the same as for PyTorch: all images of one class sit in one directory, and the directory name is the class name. The script sits at the same level as the dataset directory. Running it produces a txt file whose contents look like this:

```
/cotta/0_other/0_1_391_572_68_68.jpg 0
/cotta/1_longSleeves/9605_1_5_565_357_82_70.jpg 1
/cotta/2_cotta/713_0.99796_1_316_162_96_87.jpg 2
......
```
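Since a malformed line in this list file only surfaces later as a cryptic Caffe data-layer error, it can be worth sanity-checking the file before training. A minimal sketch (the `check_list_file` helper and the sample lines are mine, not from the original post):

```python
# Sanity-check an ImageData-style list file: each line must be
# "<image_path> <integer_label>". Returns totals and per-class counts.

def check_list_file(lines):
    counts = {}
    for lineno, line in enumerate(lines, 1):
        line = line.strip()
        if not line:
            continue  # ignore blank lines
        parts = line.rsplit(' ', 1)  # the label is the last token
        if len(parts) != 2 or not parts[1].isdigit():
            raise ValueError('malformed line %d: %r' % (lineno, line))
        label = int(parts[1])
        counts[label] = counts.get(label, 0) + 1
    return sum(counts.values()), counts

sample = [
    '/cotta/0_other/0_1_391_572_68_68.jpg 0',
    '/cotta/1_longSleeves/9605_1_5_565_357_82_70.jpg 1',
    '/cotta/2_cotta/713_0.99796_1_316_162_96_87.jpg 2',
]
total, per_class = check_list_file(sample)
print(total, per_class)  # 3 {0: 1, 1: 1, 2: 1}
```

The per-class counts are also a quick way to spot class imbalance before deciding on the train/test split.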
Each line holds the image's path followed by its class label.

Network definition: trainval.prototxt

```
layer {
  name: "data"  type: "ImageData"
  top: "data"  top: "label"
  transform_param { mirror: true  crop_size: 96 }
  image_data_param {
    source: "/train_txt/train_cotta.txt"  # path of the generated txt file
    root_folder: "/data/"                 # directory that holds the dataset
    batch_size: 64
    shuffle: true
    new_height: 96
    new_width: 96
  }
}
layer {
  name: "conv1"  type: "Convolution"  bottom: "data"  top: "conv1"
  convolution_param { num_output: 96  kernel_size: 3  stride: 1  pad: 1  weight_filler { type: "xavier" } }
}
layer { name: "BatchNorm1" type: "BatchNorm" bottom: "conv1" top: "BatchNorm1" }
layer { name: "relu_conv1" type: "ReLU" bottom: "BatchNorm1" top: "BatchNorm1" }
layer {
  name: "pool1"  type: "Pooling"  bottom: "BatchNorm1"  top: "pool1"
  pooling_param { pool: MAX  kernel_size: 2  stride: 2 }
}

layer {
  name: "fire2/squeeze1x1"  type: "Convolution"  bottom: "pool1"  top: "fire2/squeeze1x1"
  convolution_param { num_output: 16  kernel_size: 1  weight_filler { type: "xavier" } }
}
layer { name: "fire2/bn_squeeze1x1" type: "BatchNorm" bottom: "fire2/squeeze1x1" top: "fire2/bn_squeeze1x1" }
layer { name: "fire2/relu_squeeze1x1" type: "ReLU" bottom: "fire2/bn_squeeze1x1" top: "fire2/bn_squeeze1x1" }
layer {
  name: "fire2/expand1x1"  type: "Convolution"  bottom: "fire2/bn_squeeze1x1"  top: "fire2/expand1x1"
  convolution_param { num_output: 64  kernel_size: 1  weight_filler { type: "xavier" } }
}
layer { name: "fire2/bn_expand1x1" type: "BatchNorm" bottom: "fire2/expand1x1" top: "fire2/bn_expand1x1" }
layer { name: "fire2/relu_expand1x1" type: "ReLU" bottom: "fire2/bn_expand1x1" top: "fire2/bn_expand1x1" }
layer {
  name: "fire2/expand3x3"  type: "Convolution"  bottom: "fire2/bn_expand1x1"  top: "fire2/expand3x3"
  convolution_param { num_output: 64  pad: 1  kernel_size: 3  weight_filler { type: "xavier" } }
}
layer { name: "fire2/bn_expand3x3" type: "BatchNorm" bottom: "fire2/expand3x3" top: "fire2/bn_expand3x3" }
layer { name: "fire2/relu_expand3x3" type: "ReLU" bottom: "fire2/bn_expand3x3" top: "fire2/bn_expand3x3" }
layer { name: "fire2/concat" type: "Concat" bottom: "fire2/bn_expand1x1" bottom: "fire2/bn_expand3x3" top: "fire2/concat" }
#fire2 ends: 128 channels

layer {
  name: "fire3/squeeze1x1"  type: "Convolution"  bottom: "fire2/concat"  top: "fire3/squeeze1x1"
  convolution_param { num_output: 16  kernel_size: 1  weight_filler { type: "xavier" } }
}
layer { name: "fire3/bn_squeeze1x1" type: "BatchNorm" bottom: "fire3/squeeze1x1" top: "fire3/bn_squeeze1x1" }
layer { name: "fire3/relu_squeeze1x1" type: "ReLU" bottom: "fire3/bn_squeeze1x1" top: "fire3/bn_squeeze1x1" }
layer {
  name: "fire3/expand1x1"  type: "Convolution"  bottom: "fire3/bn_squeeze1x1"  top: "fire3/expand1x1"
  convolution_param { num_output: 64  kernel_size: 1  weight_filler { type: "xavier" } }
}
layer { name: "fire3/bn_expand1x1" type: "BatchNorm" bottom: "fire3/expand1x1" top: "fire3/bn_expand1x1" }
layer { name: "fire3/relu_expand1x1" type: "ReLU" bottom: "fire3/bn_expand1x1" top: "fire3/bn_expand1x1" }
layer {
  name: "fire3/expand3x3"  type: "Convolution"  bottom: "fire3/bn_expand1x1"  top: "fire3/expand3x3"
  convolution_param { num_output: 64  pad: 1  kernel_size: 3  weight_filler { type: "xavier" } }
}
layer { name: "fire3/bn_expand3x3" type: "BatchNorm" bottom: "fire3/expand3x3" top: "fire3/bn_expand3x3" }
layer { name: "fire3/relu_expand3x3" type: "ReLU" bottom: "fire3/bn_expand3x3" top: "fire3/bn_expand3x3" }
layer { name: "fire3/concat" type: "Concat" bottom: "fire3/bn_expand1x1" bottom: "fire3/bn_expand3x3" top: "fire3/concat" }
#fire3 ends: 128 channels

layer { name: "bypass_23" type: "Eltwise" bottom: "fire2/concat" bottom: "fire3/concat" top: "fire3_EltAdd" }

layer {
  name: "fire4/squeeze1x1"  type: "Convolution"  bottom: "fire3_EltAdd"  top: "fire4/squeeze1x1"
  convolution_param { num_output: 32  kernel_size: 1  weight_filler { type: "xavier" } }
}
layer { name: "fire4/bn_squeeze1x1" type: "BatchNorm" bottom: "fire4/squeeze1x1" top: "fire4/bn_squeeze1x1" }
layer { name: "fire4/relu_squeeze1x1" type: "ReLU" bottom: "fire4/bn_squeeze1x1" top: "fire4/bn_squeeze1x1" }
layer {
  name: "fire4/expand1x1"  type: "Convolution"  bottom: "fire4/bn_squeeze1x1"  top: "fire4/expand1x1"
  convolution_param { num_output: 128  kernel_size: 1  weight_filler { type: "xavier" } }
}
layer { name: "fire4/bn_expand1x1" type: "BatchNorm" bottom: "fire4/expand1x1" top: "fire4/bn_expand1x1" }
layer { name: "fire4/relu_expand1x1" type: "ReLU" bottom: "fire4/bn_expand1x1" top: "fire4/bn_expand1x1" }
layer {
  name: "fire4/expand3x3"  type: "Convolution"  bottom: "fire4/bn_expand1x1"  top: "fire4/expand3x3"
  convolution_param { num_output: 128  pad: 1  kernel_size: 3  weight_filler { type: "xavier" } }
}
layer { name: "fire4/bn_expand3x3" type: "BatchNorm" bottom: "fire4/expand3x3" top: "fire4/bn_expand3x3" }
layer { name: "fire4/relu_expand3x3" type: "ReLU" bottom: "fire4/bn_expand3x3" top: "fire4/bn_expand3x3" }
layer { name: "fire4/concat" type: "Concat" bottom: "fire4/bn_expand1x1" bottom: "fire4/bn_expand3x3" top: "fire4/concat" }
#fire4 ends: 256 channels

layer {
  name: "pool4"  type: "Pooling"  bottom: "fire4/concat"  top: "pool4"
  pooling_param { pool: MAX  kernel_size: 2  stride: 2 }
}
#fire4 ends: 256 channels / pooled

layer {
  name: "fire5/squeeze1x1"  type: "Convolution"  bottom: "pool4"  top: "fire5/squeeze1x1"
  convolution_param { num_output: 32  kernel_size: 1  weight_filler { type: "xavier" } }
}
layer { name: "fire5/bn_squeeze1x1" type: "BatchNorm" bottom: "fire5/squeeze1x1" top: "fire5/bn_squeeze1x1" }
layer { name: "fire5/relu_squeeze1x1" type: "ReLU" bottom: "fire5/bn_squeeze1x1" top: "fire5/bn_squeeze1x1" }
layer {
  name: "fire5/expand1x1"  type: "Convolution"  bottom: "fire5/bn_squeeze1x1"  top: "fire5/expand1x1"
  convolution_param { num_output: 128  kernel_size: 1  weight_filler { type: "xavier" } }
}
layer { name: "fire5/bn_expand1x1" type: "BatchNorm" bottom: "fire5/expand1x1" top: "fire5/bn_expand1x1" }
layer { name: "fire5/relu_expand1x1" type: "ReLU" bottom: "fire5/bn_expand1x1" top: "fire5/bn_expand1x1" }
layer {
  name: "fire5/expand3x3"  type: "Convolution"  bottom: "fire5/bn_expand1x1"  top: "fire5/expand3x3"
  convolution_param { num_output: 128  pad: 1  kernel_size: 3  weight_filler { type: "xavier" } }
}
layer { name: "fire5/bn_expand3x3" type: "BatchNorm" bottom: "fire5/expand3x3" top: "fire5/bn_expand3x3" }
layer { name: "fire5/relu_expand3x3" type: "ReLU" bottom: "fire5/bn_expand3x3" top: "fire5/bn_expand3x3" }
layer { name: "fire5/concat" type: "Concat" bottom: "fire5/bn_expand1x1" bottom: "fire5/bn_expand3x3" top: "fire5/concat" }
#fire5 ends: 256 channels

layer { name: "bypass_45" type: "Eltwise" bottom: "pool4" bottom: "fire5/concat" top: "fire5_EltAdd" }

layer {
  name: "fire6/squeeze1x1"  type: "Convolution"  bottom: "fire5_EltAdd"  top: "fire6/squeeze1x1"
  convolution_param { num_output: 48  kernel_size: 1  weight_filler { type: "xavier" } }
}
layer { name: "fire6/bn_squeeze1x1" type: "BatchNorm" bottom: "fire6/squeeze1x1" top: "fire6/bn_squeeze1x1" }
layer { name: "fire6/relu_squeeze1x1" type: "ReLU" bottom: "fire6/bn_squeeze1x1" top: "fire6/bn_squeeze1x1" }
layer {
  name: "fire6/expand1x1"  type: "Convolution"  bottom: "fire6/bn_squeeze1x1"  top: "fire6/expand1x1"
  convolution_param { num_output: 192  kernel_size: 1  weight_filler { type: "xavier" } }
}
layer { name: "fire6/bn_expand1x1" type: "BatchNorm" bottom: "fire6/expand1x1" top: "fire6/bn_expand1x1" }
layer { name: "fire6/relu_expand1x1" type: "ReLU" bottom: "fire6/bn_expand1x1" top: "fire6/bn_expand1x1" }
layer {
  name: "fire6/expand3x3"  type: "Convolution"  bottom: "fire6/bn_expand1x1"  top: "fire6/expand3x3"
  convolution_param { num_output: 192  pad: 1  kernel_size: 3  weight_filler { type: "xavier" } }
}
layer { name: "fire6/bn_expand3x3" type: "BatchNorm" bottom: "fire6/expand3x3" top: "fire6/bn_expand3x3" }
layer { name: "fire6/relu_expand3x3" type: "ReLU" bottom: "fire6/bn_expand3x3" top: "fire6/bn_expand3x3" }
layer { name: "fire6/concat" type: "Concat" bottom: "fire6/bn_expand1x1" bottom: "fire6/bn_expand3x3" top: "fire6/concat" }
#fire6 ends: 384 channels

layer {
  name: "fire7/squeeze1x1"  type: "Convolution"  bottom: "fire6/concat"  top: "fire7/squeeze1x1"
  convolution_param { num_output: 48  kernel_size: 1  weight_filler { type: "xavier" } }
}
layer { name: "fire7/bn_squeeze1x1" type: "BatchNorm" bottom: "fire7/squeeze1x1" top: "fire7/bn_squeeze1x1" }
layer { name: "fire7/relu_squeeze1x1" type: "ReLU" bottom: "fire7/bn_squeeze1x1" top: "fire7/bn_squeeze1x1" }
layer {
  name: "fire7/expand1x1"  type: "Convolution"  bottom: "fire7/bn_squeeze1x1"  top: "fire7/expand1x1"
  convolution_param { num_output: 192  kernel_size: 1  weight_filler { type: "xavier" } }
}
layer { name: "fire7/bn_expand1x1" type: "BatchNorm" bottom: "fire7/expand1x1" top: "fire7/bn_expand1x1" }
layer { name: "fire7/relu_expand1x1" type: "ReLU" bottom: "fire7/bn_expand1x1" top: "fire7/bn_expand1x1" }
layer {
  name: "fire7/expand3x3"  type: "Convolution"  bottom: "fire7/bn_expand1x1"  top: "fire7/expand3x3"
  convolution_param { num_output: 192  pad: 1  kernel_size: 3  weight_filler { type: "xavier" } }
}
layer { name: "fire7/bn_expand3x3" type: "BatchNorm" bottom: "fire7/expand3x3" top: "fire7/bn_expand3x3" }
layer { name: "fire7/relu_expand3x3" type: "ReLU" bottom: "fire7/bn_expand3x3" top: "fire7/bn_expand3x3" }
layer { name: "fire7/concat" type: "Concat" bottom: "fire7/bn_expand1x1" bottom: "fire7/bn_expand3x3" top: "fire7/concat" }
#fire7 ends: 384 channels

layer { name: "bypass_67" type: "Eltwise" bottom: "fire6/concat" bottom: "fire7/concat" top: "fire7_EltAdd" }

layer {
  name: "fire8/squeeze1x1"  type: "Convolution"  bottom: "fire7_EltAdd"  top: "fire8/squeeze1x1"
  convolution_param { num_output: 64  kernel_size: 1  weight_filler { type: "xavier" } }
}
layer { name: "fire8/bn_squeeze1x1" type: "BatchNorm" bottom: "fire8/squeeze1x1" top: "fire8/bn_squeeze1x1" }
layer { name: "fire8/relu_squeeze1x1" type: "ReLU" bottom: "fire8/bn_squeeze1x1" top: "fire8/bn_squeeze1x1" }
layer {
  name: "fire8/expand1x1"  type: "Convolution"  bottom: "fire8/bn_squeeze1x1"  top: "fire8/expand1x1"
  convolution_param { num_output: 256  kernel_size: 1  weight_filler { type: "xavier" } }
}
layer { name: "fire8/bn_expand1x1" type: "BatchNorm" bottom: "fire8/expand1x1" top: "fire8/bn_expand1x1" }
layer { name: "fire8/relu_expand1x1" type: "ReLU" bottom: "fire8/bn_expand1x1" top: "fire8/bn_expand1x1" }
layer {
  name: "fire8/expand3x3"  type: "Convolution"  bottom: "fire8/bn_expand1x1"  top: "fire8/expand3x3"
  convolution_param { num_output: 256  pad: 1  kernel_size: 3  weight_filler { type: "xavier" } }
}
layer { name: "fire8/bn_expand3x3" type: "BatchNorm" bottom: "fire8/expand3x3" top: "fire8/bn_expand3x3" }
layer { name: "fire8/relu_expand3x3" type: "ReLU" bottom: "fire8/bn_expand3x3" top: "fire8/bn_expand3x3" }
layer { name: "fire8/concat" type: "Concat" bottom: "fire8/bn_expand1x1" bottom: "fire8/bn_expand3x3" top: "fire8/concat" }
#fire8 ends: 512 channels

layer {
  name: "pool8"  type: "Pooling"  bottom: "fire8/concat"  top: "pool8"
  pooling_param { pool: MAX  kernel_size: 2  stride: 2 }
}
#fire8 ends: 512 channels / pooled

layer {
  name: "fire9/squeeze1x1"  type: "Convolution"  bottom: "pool8"  top: "fire9/squeeze1x1"
  convolution_param { num_output: 64  kernel_size: 1  weight_filler { type: "xavier" } }
}
layer { name: "fire9/bn_squeeze1x1" type: "BatchNorm" bottom: "fire9/squeeze1x1" top: "fire9/bn_squeeze1x1" }
layer { name: "fire9/relu_squeeze1x1" type: "ReLU" bottom: "fire9/bn_squeeze1x1" top: "fire9/bn_squeeze1x1" }
layer {
  name: "fire9/expand1x1"  type: "Convolution"  bottom: "fire9/bn_squeeze1x1"  top: "fire9/expand1x1"
  convolution_param { num_output: 256  kernel_size: 1  weight_filler { type: "xavier" } }
}
layer { name: "fire9/bn_expand1x1" type: "BatchNorm" bottom: "fire9/expand1x1" top: "fire9/bn_expand1x1" }
layer { name: "fire9/relu_expand1x1" type: "ReLU" bottom: "fire9/bn_expand1x1" top: "fire9/bn_expand1x1" }
layer {
  name: "fire9/expand3x3"  type: "Convolution"  bottom: "fire9/bn_expand1x1"  top: "fire9/expand3x3"
  convolution_param { num_output: 256  pad: 1  kernel_size: 3  weight_filler { type: "xavier" } }
}
layer { name: "fire9/bn_expand3x3" type: "BatchNorm" bottom: "fire9/expand3x3" top: "fire9/bn_expand3x3" }
layer { name: "fire9/relu_expand3x3" type: "ReLU" bottom: "fire9/bn_expand3x3" top: "fire9/bn_expand3x3" }
layer { name: "fire9/concat" type: "Concat" bottom: "fire9/bn_expand1x1" bottom: "fire9/bn_expand3x3" top: "fire9/concat" }
#fire9 ends: 512 channels

layer {
  name: "conv10_new"  type: "Convolution"  bottom: "fire9/concat"  top: "conv10"
  convolution_param {
    num_output: 3
    kernel_size: 1
    weight_filler { type: "gaussian"  mean: 0.0  std: 0.01 }
  }
}
layer {
  name: "pool10"  type: "Pooling"  bottom: "conv10"  top: "pool10"
  pooling_param { pool: AVE  global_pooling: true }
}

# loss and accuracy
layer {
  name: "loss"  type: "SoftmaxWithLoss"
  bottom: "pool10"  bottom: "label"  top: "loss"
  include {
    # phase: TRAIN
  }
}
layer {
  name: "accuracy"  type: "Accuracy"
  bottom: "pool10"  bottom: "label"  top: "accuracy"
  #include {
  #  phase: TEST
  #}
}
```

Set `num_output` of the final convolution layer `conv10_new` to your number of classes.

Solver configuration: solver.prototxt

```
test_iter: 2000          # not subject to iter_size
test_interval: 1000000
# base_lr: 0.0001
base_lr: 0.005           # learning rate
display: 40
# max_iter: 600000
max_iter: 200000         # number of iterations
iter_size: 2             # global batch size = batch_size * iter_size
lr_policy: "poly"
power: 1.0               # linearly decrease LR
momentum: 0.9
weight_decay: 0.0002
snapshot: 10000          # save a snapshot every this many iterations
snapshot_prefix: "/data/zxc/classfication/model/model_cotta/cotta_"  # where snapshots are saved
solver_mode: GPU
random_seed: 42
net: "./trainNets_drive/trainval.prototxt"  # path to the network definition
test_initialization: false
average_loss: 40
```

Two notes:

  • max_iter: Caffe counts iterations, not epochs as PyTorch does. In PyTorch, one epoch is a full pass over the training set; in Caffe, one iteration processes one batch. One epoch therefore corresponds to len(train_data) / batch_size iterations. If that does not divide evenly, whether you round down or up depends on how PyTorch's DataLoader is configured: dropping the last incomplete batch means rounding down, keeping it means rounding up.
  • snapshot_prefix: the last path component is the filename prefix of every saved snapshot.

Running the training

Put the training command into a bash file, train.sh:

```
/home/seg/anaconda3/envs/zxc/bin/caffe train -gpu 1 \
    -solver ./solvers/solver_3.prototxt \
    -weights /data/classfication/model/model_cotta/cotta__iter_200000.caffemodel \
    2>&1 | tee log_3_4_class.txt
```

  • -gpu: which GPU to use (0 if you only have one)
  • -solver: path to the solver configuration file
  • -weights: pretrained weights to start from. You can use the official Caffe release of the SqueezeNet pretrained model; here I resume from my own checkpoint after an interrupted run.

Once the script is written, run `source activate <env_name>` to enter the environment, then `source train.sh` to start training.
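The epoch-to-iteration conversion described above can be sketched as follows (the dataset size, batch size, and epoch count here are made-up illustration values, and `drop_last` mirrors the PyTorch DataLoader flag of the same name):

```python
import math

def epochs_to_iters(num_train, batch_size, epochs, drop_last=True):
    """Convert PyTorch-style epochs into Caffe-style iterations.

    One Caffe iteration consumes one batch; one epoch is a full pass
    over the training set. drop_last=True discards the final incomplete
    batch (round down), drop_last=False keeps it (round up).
    """
    if drop_last:
        iters_per_epoch = num_train // batch_size
    else:
        iters_per_epoch = math.ceil(num_train / batch_size)
    return iters_per_epoch * epochs

# e.g. 100,000 training images, batch_size 64, 128 epochs
print(epochs_to_iters(100_000, 64, 128))         # 1562 iters/epoch -> 199936
print(epochs_to_iters(100_000, 64, 128, False))  # 1563 iters/epoch -> 200064
```

Either result is close to the max_iter: 200000 used in the solver above; the exact value matters less than understanding that max_iter scales with dataset size.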