﻿<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:trackback="http://madskills.com/public/xml/rss/module/trackback/" xmlns:wfw="http://wellformedweb.org/CommentAPI/" xmlns:slash="http://purl.org/rss/1.0/modules/slash/"><channel><title>C++ Blog - 杰 - Post Category: Academic</title><link>http://www.cppblog.com/guijie/category/13861.html</link><description>Hello, 杰哥! Haha!</description><language>zh-cn</language><lastBuildDate>Tue, 21 Jul 2020 11:34:57 GMT</lastBuildDate><pubDate>Tue, 21 Jul 2020 11:34:57 GMT</pubDate><ttl>60</ttl><item><title>[zz] A "soul hyperparameter tuner" gets crushed by AutoGluon; Mu Li: the era of skillful hand-tuning is coming to an end</title><link>http://www.cppblog.com/guijie/archive/2020/01/22/217096.html</link><dc:creator>杰哥</dc:creator><author>杰哥</author><pubDate>Wed, 22 Jan 2020 15:25:00 GMT</pubDate><guid>http://www.cppblog.com/guijie/archive/2020/01/22/217096.html</guid><wfw:comment>http://www.cppblog.com/guijie/comments/217096.html</wfw:comment><comments>http://www.cppblog.com/guijie/archive/2020/01/22/217096.html#Feedback</comments><slash:comments>0</slash:comments><wfw:commentRss>http://www.cppblog.com/guijie/comments/commentRss/217096.html</wfw:commentRss><trackback:ping>http://www.cppblog.com/guijie/services/trackbacks/217096.html</trackback:ping><description><![CDATA[<section powered-by="xiumi.us"><section><section><h5><span style="font-size: 15px;">Amazon AWS recently released AutoGluon, which generates high-performance models automatically with just three lines of code, automating hyperparameter tuning, neural architecture search, and more. </span><span style="font-size: 15px;">One tuner tried it for himself: the Faster R-CNN he had tuned by hand was beaten by a full 6 points by a YOLO model that AutoGluon found automatically through NAS. </span><span style="font-size: 15px;">This left Mu Li marveling: </span><span style="font-size: 15px;">the era of skillful hand-tuning is coming to an end. </span><span style="font-size: 15px;">Follow the link to learn more!</span></h5><section style="margin-right: 8px; margin-left: 8px;"><br /></section></section></section></section><section style="margin-right: 8px; margin-left: 8px;">Amazon recently announced AutoGluon, a new open-source library that developers can use to build machine learning applications on image, text, or tabular datasets. With AutoGluon, just a few lines of code are enough to harness the power of deep learning.</section><section style="margin-right: 
8px; margin-left: 8px;"><br /></section><section style="margin-right: 8px; margin-left: 8px;">Quite a few people could not contain their excitement and rushed to try it out. Sure enough, a self-styled &#8220;soul hyperparameter tuner&#8221; named &#8220;Justin ho&#8221; drew on his own experience to tell us a late-night ghost story:</section><section style="margin-right: 8px; margin-left: 8px;"><br /></section><section style="margin-right: 8px; margin-left: 8px;"><span style="font-size: 15px;">The Faster R-CNN (ResNet-50 backbone) he had tuned by hand was beaten by a full 6 points by the YOLO model (MobileNet backbone) that AutoGluon found automatically through NAS.</span></section><section style="margin-right: 8px; margin-left: 8px;"><br /></section><section 
style="margin-right: 8px; margin-left: 8px;">This led Mu Li to remark that the era of skillful hand-tuning is coming to an end:</section><section style="margin-right: 8px; margin-left: 8px;"><br /><img style="text-align: center; width: 677px !important; height: auto !important; visibility: visible !important;" src="https://mmbiz.qpic.cn/mmbiz_jpg/UicQ7HgWiaUb0FauUSft0wnNgicqViakP2ZI0odibDPJQ10CYtXPnahNsRwfUlHM1UQURAibjibNqvUmU9Snic3EhfNZSA/640?wx_fmt=jpeg&amp;wxfrom=5&amp;wx_lazy=1&amp;wx_co=1" alt="" /></section><section style="margin-right: 8px; margin-left: 8px;"><br /></section><section style="margin-right: 8px; margin-left: 8px;"><span style="font-size: 15px;">Zhihu users lamented that the hyperparameter wizards are about to lose their jobs:</span><br /></section><section style="margin-right: 8px; margin-left: 8px;"><br /></section><section style="text-align: center; margin-right: 8px; margin-left: 8px;"><img style="width: 626px !important; height: auto !important; visibility: visible !important;" src="https://mmbiz.qpic.cn/mmbiz_png/UicQ7HgWiaUb0FauUSft0wnNgicqViakP2ZIILevc8AcMT9Hpf48icmzHgTU8ZnudXyVliaFPaFqaBibHXiaDlic8xQPn0w/640?wx_fmt=png&amp;wxfrom=5&amp;wx_lazy=1&amp;wx_co=1" alt="" /></section><section style="margin-right: 8px; margin-left: 8px;"><br /></section><section style="margin-right: 8px; margin-left: 8px;"><span style="font-size: 15px;">Next, let us introduce AutoGluon.</span></section><section style="margin-right: 8px; margin-left: 8px;"><br 
/></section><section data-style-type="0" autoid="1895"><section style="border-color: #ffca00; padding: 8px; line-height: 1.4; font-family: inherit; font-size: 18px; font-weight: bold; text-decoration: inherit; border-left-width: 6px; border-left-style: solid; max-width: 100%; box-sizing: border-box; overflow-wrap: break-word;"><section style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;"><span style="letter-spacing: 0.5px;">AutoGluon: less code, higher SOTA model performance</span></section></section></section><section style="text-align: center; margin-right: 8px; margin-left: 8px;"><br /></section><section style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;"><span style="font-size: 15px;">Deep learning is a branch of machine learning, and deep learning models are inspired by the structure of the human brain. A deep learning algorithm typically stacks many layers that learn useful representations of the input data. In a deep learning model for image recognition, for example, the lower layers detect basic features such as colors and edges, while the higher layers recognize more complex features such as digits or objects.</span></section><section style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;"><br /></section><section style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;"><span style="font-size: 15px;"><img style="width: 677px !important; height: auto !important; visibility: visible !important;" src="https://mmbiz.qpic.cn/mmbiz_png/UicQ7HgWiaUb1HrCFKrS1A85nslL0j9NaOAuicpoiaCMH2UuClAB2QyeicEqjMRouk9pV57D7HCqpDjZwXcPSwB8TLg/640?wx_fmt=png&amp;wxfrom=5&amp;wx_lazy=1&amp;wx_co=1" alt="" /></span></section><section style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;"><span style="font-size: 15px;">Generally speaking, deploying a deep learning model that achieves state-of-the-art performance requires broad expertise, and for now its use remains largely confined to a limited pool of experts. Over the past decade, however, researchers' efforts to simplify deep learning, lower its barrier to entry, and make machine learning convenient for many more technical professionals have clearly paid off.</span></section><section style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;">&nbsp;</section><section style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;"><span style="font-size: 15px;">For example, developers once had to invest considerable time and expertise in computing the gradients needed to train a deep learning model. Gradients are vectors that identify the best-performing parameter updates, those that minimize the error on the training examples. Software libraries such as Theano can automatically compute the gradients of even highly complex neural networks, letting developers exploit increasingly sophisticated architectures through boilerplate code.</span></section><section style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;">&nbsp;</section><section style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;"><strong><span style="letter-spacing: 0.5px; font-size: 15px;">Newer libraries such as Keras represent a further step in the democratization of deep learning.</span></strong><span style="letter-spacing: 0.5px; font-size: 15px;"> They let developers specify parameters such as the number of inputs and the number of layers in a model, and even define a network layer in just a few lines of code, stripping away the large amount of boilerplate that is unavoidable in existing libraries.</span></section><section style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;"><br /></section><section style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;"><span style="font-size: 15px;">Even with these advances, however, today's deep learning experts and developers must still wrestle with many thorny problems, including hyperparameter tuning, data preprocessing, neural architecture search, and decisions about transfer learning.</span><br /></section><section 
style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;"><br /></section><ul style="margin-right: 8px; margin-left: 8px;"><li><p style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;"><span style="color: #ff6827;"><strong><span style="letter-spacing: 0.5px; font-size: 15px;">Hyperparameter tuning</span></strong></span><span style="letter-spacing: 0.5px; font-size: 15px;"> involves choosing how many layers the neural network has, how those layers are connected (that is, the network's architecture), and how the network is trained.</span></p><p style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;"><br /></p></li><li><p style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;"><span style="color: #ff6827;"><strong><span style="letter-spacing: 0.5px; font-size: 15px;">Data processing,</span></strong></span><span style="letter-spacing: 0.5px; font-size: 15px;"> including categorizing data and preprocessing it into correctly formatted vectors, can also be a very tedious process.</span></p></li></ul><section style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;"><br /></section><ul style="margin-right: 8px; margin-left: 8px;"><li><p style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;"><span style="color: #ff6827;"><strong><span style="letter-spacing: 0.5px; font-size: 15px;">Neural architecture search</span></strong></span><span style="letter-spacing: 0.5px; font-size: 15px;"> automates architecture engineering, enabling developers to find the best design for their machine learning model. All of these decisions demand considerable expertise and raise the barrier to deep learning.</span></p></li></ul><p><br /></p><section style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;"><br /></section><section data-style-type="0" autoid="1895"><section style="border-color: #ffca00; padding: 8px; line-height: 1.4; font-family: inherit; font-size: 18px; font-weight: bold; text-decoration: inherit; border-left-width: 6px; border-left-style: solid; max-width: 100%; box-sizing: border-box; overflow-wrap: break-word;"><section style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;">Three lines of code to a high-performance model: automating the human decisions</section></section></section><section style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;"><br /></section><section style="line-height: 1.75em; font-family: -apple-system-font, BlinkMacSystemFont, 
&quot;Helvetica Neue&quot;, Arial, sans-serif; margin-right: 8px; margin-left: 8px;"><strong><span style="letter-spacing: 0.5px; font-size: 15px;">AutoGluon automates many of these decisions for developers, who need only three lines of code to generate a high-performance neural network model!</span></strong></section><section style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;"><br /></section><section style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;"><span style="letter-spacing: 0.5px; font-size: 15px;">Rather than manually trying out the hundreds of choices that designing a deep learning model entails, developers simply specify when they need the trained model to be ready. In response, AutoGluon uses the available compute resources to find the strongest model within its allotted runtime.</span></section><section style="text-align: center; line-height: 1.75em; margin-right: 8px; margin-left: 8px;"></section><section style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;"><span style="letter-spacing: 0.5px; font-size: 15px;">AutoGluon can produce that three-line model by automatically tuning its choices within default ranges that are known to perform well on the given task.</span></section><section style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;"><br /></section><section style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;"><span style="letter-spacing: 0.5px; font-size: 15px;">Mueller said: &#8220;Because of the inherent opacity of deep learning, many of the choices its experts make are based on ad-hoc intuition rather than rigorous scientific guidance. AutoGluon solves this problem, because every choice is tuned automatically within default ranges that perform well for the particular task and model.&#8221;</span></section><section style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;"><br /></section><section data-style-type="0" autoid="1895"><section style="border-color: #ffca00; padding: 8px; line-height: 1.4; font-family: inherit; font-size: 18px; font-weight: bold; text-decoration: inherit; border-left-width: 6px; border-left-style: solid; max-width: 100%; box-sizing: border-box; overflow-wrap: break-word;"><section style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;"><span style="letter-spacing: 0.5px;">Official guide: getting started with AutoGluon</span><br /></section></section></section><section style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;"><br /></section><section style="text-align: center; font-family: &quot;Microsoft YaHei 
UI&quot;, Arial, sans-serif; margin-right: 8px; margin-left: 8px;"><img style="width: 677px !important; height: auto !important; visibility: visible !important;" src="https://mmbiz.qpic.cn/mmbiz_png/UicQ7HgWiaUb1HrCFKrS1A85nslL0j9NaOYUTmRh7yjVyaEhFiaujJ5BQ3IU1EpPEn8YwugUl76roKFZjjQkIXvcQ/640?wx_fmt=png&amp;wxfrom=5&amp;wx_lazy=1&amp;wx_co=1" alt="" /></section><section style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;"><span style="letter-spacing: 0.5px; font-size: 15px;">The official AutoGluon website offers developers many tutorials on applying deep learning to tabular, text, and image data, covering basic tasks such as classification and regression as well as more advanced ones such as object detection.</span><br /></section><section style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;"><br /></section><section style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;"><span style="letter-spacing: 0.5px; font-size: 15px;">For experienced developers, the AutoGluon site also provides customization instructions on using the AutoGluon API to automatically improve predictive performance in bespoke applications.</span></section><section style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;"></section><section style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;"><strong><span style="color: #888888; letter-spacing: 0.5px;">For the concise AutoGluon tutorial, the official installation guide, and more, see the website:</span></strong></section><section style="line-height: 1.75em; margin-right: 8px; margin-left: 8px;"><span style="color: #888888; letter-spacing: 0.5px;">https://autogluon.mxnet.io/<br />Reference:<br /><div>https://mp.weixin.qq.com/s/pUnQAfVgXQUJqJeqMKOCrw</div><br /></span></section><br><br><div align=right><a style="text-decoration:none;" href="http://www.cppblog.com/guijie/" target="_blank">杰哥</a> 2020-01-22 23:25 <a href="http://www.cppblog.com/guijie/archive/2020/01/22/217096.html#Feedback" target="_blank" style="text-decoration:none;">Post a comment</a></div>]]></description></item><item><title>[zz] Google is granted a GAN patent, bagging an entire family of adversarially trained networks</title><link>http://www.cppblog.com/guijie/archive/2020/01/22/217095.html</link><dc:creator>杰哥</dc:creator><author>杰哥</author><pubDate>Wed, 22 Jan 2020 15:08:00 GMT</pubDate><guid>http://www.cppblog.com/guijie/archive/2020/01/22/217095.html</guid><wfw:comment>http://www.cppblog.com/guijie/comments/217095.html</wfw:comment><comments>http://www.cppblog.com/guijie/archive/2020/01/22/217095.html#Feedback</comments><slash:comments>0</slash:comments><wfw:commentRss>http://www.cppblog.com/guijie/comments/commentRss/217095.html</wfw:commentRss><trackback:ping>http://www.cppblog.com/guijie/services/trackbacks/217095.html</trackback:ping><description><![CDATA[<p><strong><span style="text-align: justify; letter-spacing: 0.5px;">Author | 十、年</span></strong><br /></p><p><strong><span style="text-align: justify; 
letter-spacing: 0.5px;">编辑 |&nbsp;<strong style="font-family: -apple-system-font, BlinkMacSystemFont, &quot;Helvetica Neue&quot;, &quot;PingFang SC&quot;, &quot;Hiragino Sans GB&quot;, &quot;Microsoft YaHei UI&quot;, &quot;Microsoft YaHei&quot;, Arial, sans-serif; font-size: 15px; background-color: #ffffff; caret-color: #000000;"><span style="color: #333333;">Camel</span></strong><strong style="font-family: -apple-system-font, BlinkMacSystemFont, &quot;Helvetica Neue&quot;, &quot;PingFang SC&quot;, &quot;Hiragino Sans GB&quot;, &quot;Microsoft YaHei UI&quot;, &quot;Microsoft YaHei&quot;, Arial, sans-serif; font-size: 15px; background-color: #ffffff; caret-color: #000000;"></strong></span></strong></p><section style="line-height: 1.75em; margin-top: 15px; margin-bottom: 15px;"><span style="letter-spacing: 0.5px; font-size: 15px;">谷歌获得了&#8220;对抗训练神经网络&#8221;专利。</span><br /></section><section style="line-height: 1.75em; margin-top: 15px; margin-bottom: 15px;"><span style="letter-spacing: 0.5px; font-size: 15px;">根据FPO（免费专利在线）信息显示，此项专利申请于2016年的9月份，生效于2019年的12月31日。</span></section><p><img "="" style="width: 100% !important; height: auto !important; visibility: visible !important;" src="https://mmbiz.qpic.cn/mmbiz_png/vJe7ErxcLmhtTyb91p26jyqGcrpV1TOQBQ50GSBz4pjfoXNzDvecxvyB9JHfVDzh4OtHIic7dbQI343YO8QdI6A/640?wx_fmt=png&amp;wxfrom=5&amp;wx_lazy=1&amp;wx_co=1" crossorigin="anonymous" data-type="png" data-w="1305" data-src="https://mmbiz.qpic.cn/mmbiz_png/vJe7ErxcLmhtTyb91p26jyqGcrpV1TOQBQ50GSBz4pjfoXNzDvecxvyB9JHfVDzh4OtHIic7dbQI343YO8QdI6A/640?wx_fmt=png" data-s="300,640" data-ratio="0.49731800766283524" _width="100%" data-fail="0" data-backw="574" data-backh="285"  alt="" /></p><section style="line-height: 1.75em; margin-top: 15px; margin-bottom: 15px;"><span style="letter-spacing: 0.5px; font-size: 15px;">其中，发明人为Ian J. 
Goodfellow 和 Christian Szegedy。谷歌作为受让人拥有专利权，这意味着继神经网络 Dropout 专利之后，又一构建神经网络的基础方法归属于谷歌。</span><br /></section><section style="line-height: 1.75em; margin-top: 15px; margin-bottom: 15px;"><span style="letter-spacing: 0.5px; font-size: 15px;">根据该专利的声明，保护条款有14条。其中第一条便指出这是一种用来确定神经网络参数的方法，在接下来的条款中详细介绍了神经网络对抗训练的过程，涉及数据处理、模型训练等环节。也就是说，对抗训练方法中使用的目标函数、迭代方法都受法律保护。</span></section><section style="line-height: 1.75em; margin-top: 15px; margin-bottom: 15px;"><span style="letter-spacing: 0.5px; font-size: 15px;">换句话说，如果你使用对抗训练神经网络，就可能存在付费的风险。</span></section><p><img style="width: 100% !important; height: auto !important; visibility: visible !important;" src="https://mmbiz.qpic.cn/mmbiz_png/vJe7ErxcLmhtTyb91p26jyqGcrpV1TOQCsf7FakfQEqSrvUvviaGrM9UESXdC4QHyZ2jdfT74Yb0pbRrd5icRxdA/640?wx_fmt=png&amp;wxfrom=5&amp;wx_lazy=1&amp;wx_co=1" crossorigin="anonymous" data-type="png" data-w="1289" data-src="https://mmbiz.qpic.cn/mmbiz_png/vJe7ErxcLmhtTyb91p26jyqGcrpV1TOQCsf7FakfQEqSrvUvviaGrM9UESXdC4QHyZ2jdfT74Yb0pbRrd5icRxdA/640?wx_fmt=png" data-s="300,640" data-ratio="0.2808378588052754" _width="100%" data-fail="0" data-backw="574" data-backh="161"  alt="" /></p><p style="text-align: center; line-height: normal; margin-top: 15px; margin-bottom: 15px;"><span style="color: #888888; letter-spacing: 0.5px;">专利的其他参考</span><br /></p><section style="line-height: 1.75em; margin-top: 15px; margin-bottom: 15px;"><span style="letter-spacing: 0.5px; font-size: 15px;">另外值得注意的是，这项专利不仅包括GAN（生成式对抗网络）。根据专利声明，其在申请中所用词语为&#8220;方法&#8221;、&#8220;系统&#8221;，这意味着此项专利针对的是某一类机器学习问题，而不是某一个具体问题。另外，专利页面也标明了此项专利的其他参考来源不仅仅局限于</span><span style="letter-spacing: 0.5px; font-size: 15px;">Goodfellow的《Generative Adversarial Nets》。</span></section><section style="line-height: 1.75em; margin-top: 15px; margin-bottom: 15px;"><span style="letter-spacing: 0.5px; font-size: 15px;"><br /></span></section><h2><section><span style="font-size: 18px;"><strong><span style="color: 
#021eaa;">何为对抗训练</span></strong></span></section></h2><section style="line-height: 1.75em; margin-top: 15px; margin-bottom: 15px;"><span style="letter-spacing: 0.5px; font-size: 15px;">对抗训练神经网络最著名便是GAN，即生成式对抗网络，主要用在图像技术方面的图像生成和自然语言方面的生成式对话内容。</span></section><section style="text-align: center; line-height: 1.75em; margin-top: 15px; margin-bottom: 15px;"><span style="color: #888888; letter-spacing: 0.5px;"><img "="" style="width: 100% !important; height: auto !important; visibility: visible !important;" src="https://mmbiz.qpic.cn/mmbiz_png/vJe7ErxcLmhtTyb91p26jyqGcrpV1TOQXMo4SiaPEmF0ywicG2hF6U1LRY8abuqLbicMOfVwiaYRuicKKD7X6XhXbMg/640?wx_fmt=png&amp;wxfrom=5&amp;wx_lazy=1&amp;wx_co=1" crossorigin="anonymous" data-type="png" data-w="792" data-src="https://mmbiz.qpic.cn/mmbiz_png/vJe7ErxcLmhtTyb91p26jyqGcrpV1TOQXMo4SiaPEmF0ywicG2hF6U1LRY8abuqLbicMOfVwiaYRuicKKD7X6XhXbMg/640?wx_fmt=png" data-ratio="0.42045454545454547" _width="100%" data-fail="0" data-backw="574" data-backh="241"  alt="" /></span></section><section style="text-align: center; line-height: 1.75em; margin-top: 15px; margin-bottom: 15px;"><span style="color: #888888; letter-spacing: 0.5px;">生成对抗网络框架</span></section><section style="line-height: 1.75em; margin-top: 15px; margin-bottom: 15px;"><span style="letter-spacing: 0.5px; font-size: 15px;">作为一种深度学习模型，GAN是近年来复杂分布上无监督学习最具前景的方法之一。最初是Ian J. 
Goodfellow等人于2014年10月在&#8220;Generative Adversarial Networks&#8221;中提出的一个通过对抗过程估计生成模型的新框架，此框架能够使训练模型的数据更具效益。</span></section><section style="line-height: 1.75em; margin-top: 15px; margin-bottom: 15px;"><span style="letter-spacing: 0.5px; font-size: 15px;">在GAN设置中，生成器和鉴别器这两个神经网络在框架中扮演不同的角色。生成器试图生成来自某种概率分布的数据；鉴别器则像一个法官，判断输入是来自生成器还是来自真正的训练集。</span></section><section style="line-height: 1.75em; margin-top: 15px; margin-bottom: 15px;"><span style="letter-spacing: 0.5px; font-size: 15px;">例如在图像生成中，如果生成器构造的图像不够好，鉴别器就传达一个负反馈给生成器，于是生成器根据反馈调整自身参数，让下一次生成的图片质量得以提升，它就是靠这种体内自循环的方式不断提升自己构造图片的能力。其运行过程类似于武侠小说《射雕英雄传》中，王重阳的师弟周伯通所使用的&#8220;左右互搏&#8221;之术。</span></section><section style="line-height: 1.75em; margin-top: 15px; margin-bottom: 15px;"><span style="letter-spacing: 0.5px; font-size: 15px;"><br /></span></section><h2><section style="line-height: 1.75em; margin-top: 15px; margin-bottom: 15px;"><strong><span style="color: #021eaa; letter-spacing: 0.5px; font-size: 18px;">有何影响？</span></strong><strong><span style="color: #021eaa; letter-spacing: 0.5px; font-size: 18px;">只是谷歌的自我防御？</span></strong></section></h2><section style="line-height: 1.75em; margin-top: 15px; margin-bottom: 15px;"><span style="letter-spacing: 0.5px; font-size: 15px;">对于谷歌这项专利，Reddit论坛上有人提出忧虑，也有人相当乐观，觉得没啥大不了。</span></section><section style="line-height: 1.75em; margin-top: 15px; margin-bottom: 15px;"><img style="width: 100% !important; height: auto !important; visibility: visible !important;" src="https://mmbiz.qpic.cn/mmbiz_png/vJe7ErxcLmhtTyb91p26jyqGcrpV1TOQ7PfRhXaER8TGnXdpqENyfJkDCpZexagz9NfMpsicl9wUIHzSYdicmc0w/640?wx_fmt=png&amp;wxfrom=5&amp;wx_lazy=1&amp;wx_co=1" crossorigin="anonymous" data-type="png" data-w="767" data-src="https://mmbiz.qpic.cn/mmbiz_png/vJe7ErxcLmhtTyb91p26jyqGcrpV1TOQ7PfRhXaER8TGnXdpqENyfJkDCpZexagz9NfMpsicl9wUIHzSYdicmc0w/640?wx_fmt=png" data-ratio="0.1421121251629726" _width="100%" data-fail="0" data-backw="574" data-backh="82"  alt="" /></section><blockquote 
data-source-title="" data-content-utf8-length="43" data-author-name="" data-url="" data-type="2"><section><section><span style="letter-spacing: 0.5px; font-size: 15px;">哇！多谢Goodfellow，这非常酷，为一个极其宽泛的概念申请专利肯定不会扼杀创新。</span></section></section></blockquote><section style="line-height: 1.75em; margin-top: 15px; margin-bottom: 15px;"><span style="letter-spacing: 0.5px; font-size: 15px;"><img style="width: 100% !important; height: auto !important; visibility: visible !important;" src="https://mmbiz.qpic.cn/mmbiz_png/vJe7ErxcLmhtTyb91p26jyqGcrpV1TOQiaOswH830zoibfcpUnicIBttMNHAMloVssrSrq3FRyCrEWCZFTCHtcZiaA/640?wx_fmt=png&amp;wxfrom=5&amp;wx_lazy=1&amp;wx_co=1" crossorigin="anonymous" data-type="png" data-w="710" data-src="https://mmbiz.qpic.cn/mmbiz_png/vJe7ErxcLmhtTyb91p26jyqGcrpV1TOQiaOswH830zoibfcpUnicIBttMNHAMloVssrSrq3FRyCrEWCZFTCHtcZiaA/640?wx_fmt=png" data-ratio="0.1563380281690141" _width="100%" data-fail="0" data-backw="574" data-backh="90"  alt="" /></span></section><section style="line-height: 1.75em; margin-top: 15px; margin-bottom: 15px;"><span style="letter-spacing: 0.5px; font-size: 15px;">有的则意味深长地表示，这涵盖的是神经网络的对抗训练，即针对鲁棒性的训练，而不是一般的GAN。</span></section><section style="line-height: 1.75em; margin-top: 15px; margin-bottom: 15px;"><span style="letter-spacing: 0.5px; font-size: 15px;"><img style="width: 100% !important; height: auto !important; visibility: visible !important;" src="https://mmbiz.qpic.cn/mmbiz_png/vJe7ErxcLmhtTyb91p26jyqGcrpV1TOQley92h6QTZfVHdo3GHh87yGUc2ZaEenq9jG4sCDf98TmywOTodWVkQ/640?wx_fmt=png&amp;wxfrom=5&amp;wx_lazy=1&amp;wx_co=1" crossorigin="anonymous" data-type="png" data-w="826" data-src="https://mmbiz.qpic.cn/mmbiz_png/vJe7ErxcLmhtTyb91p26jyqGcrpV1TOQley92h6QTZfVHdo3GHh87yGUc2ZaEenq9jG4sCDf98TmywOTodWVkQ/640?wx_fmt=png" data-ratio="0.3353510895883777" _width="100%" data-fail="0" data-backw="574" data-backh="192"  alt="" /></span></section><section style="line-height: 1.75em; margin-top: 15px; margin-bottom: 15px;"><span 
style="letter-spacing: 0.5px; font-size: 15px;">还有网友质疑是否公平，甚至将问题引向了政治。</span></section><section style="line-height: 1.75em; margin-top: 15px; margin-bottom: 15px;"><span style="letter-spacing: 0.5px; font-size: 15px;">不过根据谷歌针对Dropout专利的态度，短时间内应该不会有风险。正如之前Jeff在Google 日本举行的传媒会议中回应的那样，此举只是为了避免不必要的麻烦、保护公司利益，并不是为了借专利技术赚钱，开发人员毋须担心。</span></section><section style="line-height: 1.75em; margin-top: 15px; margin-bottom: 15px;"><span style="letter-spacing: 0.5px; font-size: 15px;">但就中国目前的现状而言，自主知识产权的底层框架和核心算法仍然缺乏，更多依靠开源的代码和算法。在这种情况下，谷歌一系列专利获批，不仅关乎科研，还关乎更致命的自主核心算法和背后的&#8220;卡脖子&#8221;困境。<br />Reference:<br /></span><div>https://mp.weixin.qq.com/s/oNSJGJIaJlCEs4LT6aUL5g</div></section><img src ="http://www.cppblog.com/guijie/aggbug/217095.html" width = "1" height = "1" /><br><br><div align=right><a style="text-decoration:none;" href="http://www.cppblog.com/guijie/" target="_blank">杰哥</a> 2020-01-22 23:08 <a href="http://www.cppblog.com/guijie/archive/2020/01/22/217095.html#Feedback" target="_blank" style="text-decoration:none;">发表评论</a></div>]]></description></item><item><title>[zz]读China Daily学英文</title><link>http://www.cppblog.com/guijie/archive/2019/06/27/216463.html</link><dc:creator>杰哥</dc:creator><author>杰哥</author><pubDate>Wed, 26 Jun 2019 17:41:00 GMT</pubDate><guid>http://www.cppblog.com/guijie/archive/2019/06/27/216463.html</guid><wfw:comment>http://www.cppblog.com/guijie/comments/216463.html</wfw:comment><comments>http://www.cppblog.com/guijie/archive/2019/06/27/216463.html#Feedback</comments><slash:comments>0</slash:comments><wfw:commentRss>http://www.cppblog.com/guijie/comments/commentRss/216463.html</wfw:commentRss><trackback:ping>http://www.cppblog.com/guijie/services/trackbacks/216463.html</trackback:ping><description><![CDATA[<span style="color: #1a1a1a; font-family: -apple-system, BlinkMacSystemFont, &quot;Helvetica Neue&quot;, &quot;PingFang SC&quot;, &quot;Microsoft YaHei&quot;, &quot;Source Han Sans SC&quot;, &quot;Noto Sans CJK SC&quot;, &quot;WenQuanYi Micro 
Hei&quot;, sans-serif; font-size: medium; background-color: #ffffff;">目前不少人对China Daily存在偏见：报纸上的文章都是中国人写的，因此不够地道。但这种偏见完全站不住脚，因为中国日报的记者都是资深英文写手，写作水平要远超绝大多数读者，而且文章发表前还有外籍专家润色审稿</span><span style="color: #1a1a1a; font-family: -apple-system, BlinkMacSystemFont, &quot;Helvetica Neue&quot;, &quot;PingFang SC&quot;, &quot;Microsoft YaHei&quot;, &quot;Source Han Sans SC&quot;, &quot;Noto Sans CJK SC&quot;, &quot;WenQuanYi Micro Hei&quot;, sans-serif; font-size: medium; background-color: #ffffff;">。</span><br />Reference:&nbsp;<a href="https://zhuanlan.zhihu.com/p/49847636">https://zhuanlan.zhihu.com/p/49847636</a><img src ="http://www.cppblog.com/guijie/aggbug/216463.html" width = "1" height = "1" /><br><br><div align=right><a style="text-decoration:none;" href="http://www.cppblog.com/guijie/" target="_blank">杰哥</a> 2019-06-27 01:41 <a href="http://www.cppblog.com/guijie/archive/2019/06/27/216463.html#Feedback" target="_blank" style="text-decoration:none;">发表评论</a></div>]]></description></item><item><title>[zz] 为什么说图像的低频是轮廓，高频是噪声和细节</title><link>http://www.cppblog.com/guijie/archive/2019/06/19/216424.html</link><dc:creator>杰哥</dc:creator><author>杰哥</author><pubDate>Tue, 18 Jun 2019 17:01:00 GMT</pubDate><guid>http://www.cppblog.com/guijie/archive/2019/06/19/216424.html</guid><wfw:comment>http://www.cppblog.com/guijie/comments/216424.html</wfw:comment><comments>http://www.cppblog.com/guijie/archive/2019/06/19/216424.html#Feedback</comments><slash:comments>0</slash:comments><wfw:commentRss>http://www.cppblog.com/guijie/comments/commentRss/216424.html</wfw:commentRss><trackback:ping>http://www.cppblog.com/guijie/services/trackbacks/216424.html</trackback:ping><description><![CDATA[<div><font color="#0000ee">图像的频率：灰度值变化剧烈程度的指标，是灰度在平面空间上的梯度。</font></div><div><font color="#0000ee"><br /></font></div><div><font color="#0000ee">（1）什么是低频?</font></div><div><font color="#0000ee">&nbsp; &nbsp; &nbsp; 低频就是颜色缓慢地变化,也就是灰度缓慢地变化,就代表着那是连续渐变的一块区域,这部分就是低频. 
对于一幅图像来说，除去高频的就是低频了，也就是边缘以内的内容为低频，而边缘以内的内容就是图像的大部分信息，即图像的大致概貌和轮廓，是图像的近似信息。</font></div><div><font color="#0000ee"><br /></font></div><div><font color="#0000ee">（2）什么是高频?</font></div><div><font color="#0000ee"><br /></font></div><div><font color="#0000ee">&nbsp; &nbsp; &nbsp;反过来, 高频就是频率变化快.图像中什么时候灰度变化快?就是相邻区域之间灰度相差很大,这就是变化得快.图像中,一个影像与背景的边缘部位,通常会有明显的差别,也就是说在那条边线处,灰度变化很快,也即是变化频率高的部位.因此，图像边缘的灰度值变化快，就对应着频率高，即高频显示图像边缘。图像的细节处也是属于灰度值急剧变化的区域，正是因为灰度值的急剧变化，才会出现细节。</font></div><div><font color="#0000ee">&nbsp; &nbsp; &nbsp; 另外噪声（即噪点）也是这样,在一个像素所在的位置,之所以是噪点,就是因为它与正常的点颜色不一样了，也就是说该像素点灰度值明显不一样了,也就是灰度有快速的变化了,所以是高频部分，因此有噪声在高频这么一说。</font></div><div><font color="#0000ee"><br /></font></div><div><font color="#0000ee">&nbsp; &nbsp; &nbsp; 其实归根到底,是因为我们人眼识别物体就是这样的.假如你穿一件红衣服在红色背景布前拍照,你能很好地识别么?不能,因为衣服与背景融为一体了,没有变化,所以看不出来,除非有灯光从某角度照在人物身上,这样边缘处会出现高亮和阴影,这样我们就能看到一些轮廓线,这些线就是颜色（即灰度）很不一样的地方.</font></div><div><font color="#0000ee">---------------------&nbsp;</font></div><div><font color="#0000ee">作者：charlene_bo&nbsp;</font></div><div><font color="#0000ee">来源：CSDN&nbsp;</font></div><div><font color="#0000ee">原文：https://blog.csdn.net/charlene_bo/article/details/70877999&nbsp;</font></div><div><font color="#0000ee">版权声明：本文为博主原创文章，转载请附上博文链接！</font></div><img src ="http://www.cppblog.com/guijie/aggbug/216424.html" width = "1" height = "1" /><br><br><div align=right><a style="text-decoration:none;" href="http://www.cppblog.com/guijie/" target="_blank">杰哥</a> 2019-06-19 01:01 <a href="http://www.cppblog.com/guijie/archive/2019/06/19/216424.html#Feedback" target="_blank" style="text-decoration:none;">发表评论</a></div>]]></description></item><item><title>谷歌学术影响力排名(2020 Scholar Metrics)</title><link>http://www.cppblog.com/guijie/archive/2019/05/17/216377.html</link><dc:creator>杰哥</dc:creator><author>杰哥</author><pubDate>Fri, 17 May 2019 14:59:00 
GMT</pubDate><guid>http://www.cppblog.com/guijie/archive/2019/05/17/216377.html</guid><wfw:comment>http://www.cppblog.com/guijie/comments/216377.html</wfw:comment><comments>http://www.cppblog.com/guijie/archive/2019/05/17/216377.html#Feedback</comments><slash:comments>0</slash:comments><wfw:commentRss>http://www.cppblog.com/guijie/comments/commentRss/216377.html</wfw:commentRss><trackback:ping>http://www.cppblog.com/guijie/services/trackbacks/216377.html</trackback:ping><description><![CDATA[<div style="text-align: center; "><strong>谷歌学术影响力排名(2020 Scholar Metrics)</strong></div><p style="background:#F8F8F8"><span lang="EN-US" style="font-size:10.5pt;mso-bidi-font-size:
11.0pt;font-family:&quot;Calibri&quot;,sans-serif;mso-ascii-theme-font:minor-latin;
mso-fareast-font-family:宋体;mso-fareast-theme-font:minor-fareast;mso-hansi-theme-font:
minor-latin;mso-bidi-font-family:&quot;Times New Roman&quot;;mso-bidi-theme-font:minor-bidi;
mso-ansi-language:EN-US;mso-fareast-language:ZH-CN;mso-bidi-language:AR-SA"><a href="https://scholar.googleblog.com/2020/07/2020-scholar-metrics-released.html">https://scholar.googleblog.com/2020/07/2020-scholar-metrics-released.html</a><br /><span style="color: #333333; font-family: Arial, &quot;PingFang SC&quot;, &quot;Hiragino Sans GB&quot;, &quot;Microsoft YaHei&quot;, &quot;WenQuanYi Micro Hei&quot;, sans-serif; background-color: #ffffff;"><br /></span></span></p><div style="text-align: center; "><strong>谷歌学术影响力排名(2019 Scholar Metrics)</strong></div><p style="background:#F8F8F8"><span lang="EN-US" style="font-size:10.5pt;mso-bidi-font-size:
11.0pt;font-family:&quot;Calibri&quot;,sans-serif;mso-ascii-theme-font:minor-latin;
mso-fareast-font-family:宋体;mso-fareast-theme-font:minor-fareast;mso-hansi-theme-font:
minor-latin;mso-bidi-font-family:&quot;Times New Roman&quot;;mso-bidi-theme-font:minor-bidi;
mso-ansi-language:EN-US;mso-fareast-language:ZH-CN;mso-bidi-language:AR-SA"><span style="color: #333333; font-family: Arial, &quot;PingFang SC&quot;, &quot;Hiragino Sans GB&quot;, &quot;Microsoft YaHei&quot;, &quot;WenQuanYi Micro Hei&quot;, sans-serif; background-color: #ffffff;">2019年谷歌学者Top Publication榜单更新：CVPR上升到整个工程与计算机科学类第二名。人工智能领域NeurIPS, ICLR, ICML独占鳌头。自然语言处理榜单没有变化，还是ACL，EMNLP，NAACL。 &#8203;&#8203;&#8203;&#8203;<br /></span>Reference:<br /><a href="https://scholar.google.com/citations?view_op=top_venues&amp;hl=en&amp;vq=eng">https://scholar.google.com/citations?view_op=top_venues&amp;hl=en&amp;vq=eng<br /></a>Weibo of Weilian Wang on July 20, 2019. See my favourite on July 23, 2019.&nbsp;<br /><a href="https://m.sohu.com/a/245182179_473283/?pvid=000115_3w_a"><br />https://m.sohu.com/a/245182179_473283/?pvid=000115_3w_a</a>&nbsp;<span style="font-size: 10.5pt; font-family: verdana, &quot;courier new&quot;;">&nbsp; (read once)</span></span></p><div>2018谷歌学术影响力排名出炉：CVPR进入前20，ResNet被引最多过万次！</div><p style="background:#F8F8F8"><span style="font-size:13.0pt;font-family:宋体;color:#222222;"><br />来源：</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">scholar.google.com</span></p>  <p style="background:#F8F8F8"><span style="font-size:13.0pt;font-family:宋体;color:#222222;">作者：闻菲</span></p>  <p style="background:#F8F8F8"><strong><span style="font-size:13.0pt;font-family:宋体;color:#222222;">【新智元导读</span></strong><span style="font-size:13.0pt;font-family:宋体;color:#222222;">】谷歌学术昨天发表了</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">2018</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">年最新的学术期刊和会议影响力排名，</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">CVPR</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">和</span><span 
style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">NIPS</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">分别排名第</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">20</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">和第</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">54</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">。在排名第一的</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">Nature</span><span style="font-size: 13.0pt;font-family:宋体;color:#222222;">里，过去</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">5</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">年被引用次数最高的论文，正是深度学习三大神</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">Hinton</span><span style="font-size: 13.0pt;font-family:宋体;color:#222222;">、</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">LeCun</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">和</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">Bengio</span><span style="font-size: 13.0pt;font-family:宋体;color:#222222;">写的《深度学习》一文，而</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">CVPR</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">里被引次数最高的，则是</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">ResNet</span><span style="font-size: 13.0pt;font-family:宋体;color:#222222;">，引用次数超过了</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">1</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">万次。</span></p>  <p style="background:#F8F8F8"><span 
style="font-size:13.0pt;font-family:宋体;color:#222222;">昨天，谷歌学术（</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">Google Scholar</span><span style="font-size:13.0pt;font-family:宋体;color:#222222;">）公布了</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">2018</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">年最新的学术期刊</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">/</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">会议影响力排名，从综合领域看，毫不意外的，</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">Nature</span><span style="font-size: 13.0pt;font-family:宋体;color:#222222;">第一、</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">Science</span><span style="font-size: 13.0pt;font-family:宋体;color:#222222;">第三，但值得关注的是，<strong>计算机视觉顶会</strong></span><strong><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">CVPR</span></strong><strong><span style="font-size:13.0pt;font-family:宋体;color:#222222;">排名第</span></strong><strong><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">20</span></strong><strong><span style="font-size: 13.0pt;font-family:宋体;color:#222222;">，另一个</span></strong><strong><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">AI</span></strong><strong><span style="font-size: 13.0pt;font-family:宋体;color:#222222;">领域的顶会</span></strong><strong><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">NIPS</span></strong><strong><span style="font-size:13.0pt;font-family:宋体;color:#222222;">也排名第</span></strong><strong><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">54</span></strong><strong><span style="font-size: 
13.0pt;font-family:宋体;color:#222222;">，</span></strong><span style="font-size:13.0pt;font-family:宋体;color:#222222;">名次较去年有了大幅提升。</span></p>  <p style="background:#F8F8F8"><span style="font-size:13.0pt;font-family:宋体;color:#222222;">就连排名第一的</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">Nature</span><span style="font-size: 13.0pt;font-family:宋体;color:#222222;">里，过去</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">5</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">年被引用次数最高的论文，也是</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">&#8220;</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">深度学习三大神</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">&#8221;Hinton</span><span style="font-size: 13.0pt;font-family:宋体;color:#222222;">、</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">LeCun</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">和</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">Bengio</span><span style="font-size: 13.0pt;font-family:宋体;color:#222222;">合著的《深度学习》一文。</span></p>  <p align="left" style="background: #f8f8f8;"><span style="font-size:13.0pt;font-family:宋体;color:#222222;">不仅如此，在</span><span style="font-size:13.0pt; font-family:&quot;Arial&quot;,sans-serif;color:#222222;">CVPR</span><span style="font-size:13.0pt;font-family: 宋体;color:#222222;">里，过去</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">5</span><span style="font-size:13.0pt; 
font-family:宋体;color:#222222;">年被引次数最多的论文，是当时还在微软亚洲研究院的孙剑、何恺明、张祥雨、任少卿写的</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">ResNet</span><span style="font-size: 13.0pt;font-family:宋体;color:#222222;">，被引次数已经过万。</span></p>  <p style="background:#F8F8F8"><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">2018 </span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">谷歌学术期刊和会议影响力排名：</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">CVPR</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">第</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">20</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">，</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">NIPS</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">第</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">54</span></p>  <p style="background:#F8F8F8"><span style="font-size:13.0pt;font-family:宋体;color:#222222;">首先来看综合领域结果。</span></p>  <p style="background:#F8F8F8"><span style="font-size:13.0pt;font-family:宋体;color:#222222;">大家比较关心的</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">Nature</span><span style="font-size: 13.0pt;font-family:宋体;color:#222222;">、</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">Science</span><span style="font-size: 13.0pt;font-family:宋体;color:#222222;">分别位列第一和第三，医学著名期刊《新英格兰医学杂志》和《柳叶刀》分别位于第二和第四。一向被国内与</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">Nature</span><span style="font-size: 
13.0pt;font-family:宋体;color:#222222;">、</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">Science</span><span style="font-size: 13.0pt;font-family:宋体;color:#222222;">并列，有</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">&#8220;CNS&#8221;</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">之称的</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">Cell</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">，这次排名第</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">6</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">。</span></p>  <p style="background:#F8F8F8"><span style="font-size:13.0pt;font-family:宋体;color:#222222;">接下来就是新智元的读者更为关注的与人工智能有关的期刊和会议了，这一次，计算机视觉顶会</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">CVPR</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">不负众望排名第</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">20</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">，由此计算机领域顶会也终于进入</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">Top20</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">的行列。</span></p>  <p style="background:#F8F8F8"><span style="font-size:13.0pt;font-family:宋体;color:#222222;">另一方面，</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">AI</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">领域另一个备受关注的会议</span><span 
style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">NIPS</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">，也在综合排名中位列第</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">54</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">，取得了不错的成绩。</span></p>  <p style="background:#F8F8F8"><span style="font-size:13.0pt;font-family:宋体;color:#222222;">与神经科学相关的</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;"> Nature Neuroscience </span><span style="font-size:13.0pt;font-family:宋体;color:#222222;">排名第</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">44</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">。</span></p>  <p style="background:#F8F8F8"><span style="font-size:13.0pt;font-family:宋体;color:#222222;">至于第</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">21</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">名到第</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">40</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">名的期刊，实际上也常有跟</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">AI</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">相关的论文发表，大家也可以看一下排名。</span></p>  <p style="background:#F8F8F8"><span style="font-size:13.0pt;font-family:宋体;color:#222222;">值得一提，</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">PLoS ONE</span><span style="font-size: 13.0pt;font-family:宋体;color:#222222;">位于第</span><span 
style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">23</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">，</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">Scientific Reports </span><span style="font-size:13.0pt;font-family:宋体;color:#222222;">排名第</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">39</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">，也算是不错的发表场所了。</span></p>  <p style="background:#F8F8F8"><span style="font-size:13.0pt;font-family:宋体;color:#222222;">在第</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">61</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">到第</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">80</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">名中间，集中出现了多本</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">IEEE</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">期刊。被誉为另一个计算机视觉顶会的</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">ICCV</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">，排名第</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">78</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">。</span></p>  <p style="background:#F8F8F8"><span 
style="font-size:13.0pt;font-family:宋体;color:#222222;">第</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">81</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">到第</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">100</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">名的期刊</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">/</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">会议排名如下，</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">TPAMI </span><span style="font-size: 13.0pt;font-family:宋体;color:#222222;">位于第</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">92</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">，果然好论文都优先去会议发表了。</span></p>  <p style="background:#F8F8F8"><span style="font-size:13.0pt;font-family:宋体;color:#222222;">工程与计算机领域</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">Top 20</span><span style="font-size: 13.0pt;font-family:宋体;color:#222222;">：</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">CVPR</span><span style="font-size:13.0pt; font-family:宋体;color:#222222;">排名第</span><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">5</span></p>  <p style="background:#F8F8F8"><span 
style="font-size:13.0pt;font-family:宋体;color:#222222;">Google Scholar Metrics ranking method: the &#8220;h5-index&#8221; over papers from the past 5 years</span></p>  <p style="background:#F8F8F8"><strong><span style="font-size:13.0pt;font-family:宋体;color:#222222;">Google Scholar's journal and conference rankings are based mainly on the h-index.</span></strong><span style="font-size:13.0pt;font-family:宋体;color:#222222;"> In fact, since 2012, Google Scholar Metrics (GSM) has published annual GSM rankings of academic journals and conferences.</span></p>  <p style="background:#F8F8F8"><span style="font-size:13.0pt;font-family:宋体;color:#222222;">Compared with the Journal Citation Reports (JCR) that Clarivate publishes from the Web of Science database, GSM is free to search, and it indexes a far wider range of journals and conferences than Web of Science does.</span></p>  <p style="background:#F8F8F8"><span style="font-size:13.0pt;font-family:宋体;color:#222222;">One more point: <strong>a journal or conference's &#8220;h5-index&#8221; (its h-index over the past 5 years) is hard to manipulate.</strong> It does not rise noticeably because of one super-highly-cited paper, and deliberately publishing fewer papers does nothing to raise it either.</span></p>  <p style="background:#F8F8F8"><span style="font-size:13.0pt;font-family:宋体;color:#222222;">The <strong>h5-index can therefore reflect the overall strength of a journal or conference</strong>, and it has gradually become an important reference for evaluating the influence of academic publications and conferences.</span></p>  <p style="background:#F8F8F8"><span style="font-size:13.0pt;font-family:宋体;color:#222222;">Overall, GSM draws on 3 indicators: the <strong>h5-index</strong>, the <strong>h5-core</strong>, and the <strong>h5-median</strong>, all computed from the papers that venues indexed in Google Scholar published over the most recent 5 years and the citations those papers received.</span></p>  <p style="background:#F8F8F8"><span style="font-size:13.0pt;font-family:宋体;color:#222222;">For example, if at least h of the papers a journal published in the past 5 years have each been cited at least h times, that journal's h5-index is h. The h5-core and h5-median are derived from the same window: the h5-core is the set of top-cited papers that make up the h5-index, and the h5-median is the median citation count within that set.</span></p>  <p style="background:#F8F8F8"><span style="font-size:13.0pt;font-family:宋体;color:#222222;">Learn more:</span></p>  <p style="background:#F8F8F8"><span style="font-size:13.0pt;font-family:&quot;Arial&quot;,sans-serif;color:#222222;">https://scholar.google.com/citations?view_op=top_venues&amp;hl=zh-CN&amp;vq=en</span></p><img src ="http://www.cppblog.com/guijie/aggbug/216377.html" width = "1" height = "1" /><br><br><div align=right><a style="text-decoration:none;" href="http://www.cppblog.com/guijie/" target="_blank">杰哥</a> 2019-05-17 22:59 <a href="http://www.cppblog.com/guijie/archive/2019/05/17/216377.html#Feedback" target="_blank" style="text-decoration:none;">发表评论</a></div>]]></description></item><item><title>[zz] How to Train a GAN? 
Tips and tricks to make GANs work</title><link>http://www.cppblog.com/guijie/archive/2019/04/02/216325.html</link><dc:creator>杰哥</dc:creator><author>杰哥</author><pubDate>Mon, 01 Apr 2019 21:42:00 GMT</pubDate><guid>http://www.cppblog.com/guijie/archive/2019/04/02/216325.html</guid><wfw:comment>http://www.cppblog.com/guijie/comments/216325.html</wfw:comment><comments>http://www.cppblog.com/guijie/archive/2019/04/02/216325.html#Feedback</comments><slash:comments>0</slash:comments><wfw:commentRss>http://www.cppblog.com/guijie/comments/commentRss/216325.html</wfw:commentRss><trackback:ping>http://www.cppblog.com/guijie/services/trackbacks/216325.html</trackback:ping><description><![CDATA[<p style="box-sizing: border-box; margin-bottom: 16px; margin-top: 0px; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, &quot;Segoe UI&quot;, Helvetica, Arial, sans-serif, &quot;Apple Color Emoji&quot;, &quot;Segoe UI Emoji&quot;, &quot;Segoe UI Symbol&quot;; font-size: 16px; background-color: #ffffff;">While research in Generative Adversarial Networks (GANs) continues to improve the fundamental stability of these models, we use a bunch of tricks to train them and make them stable day to day.</p><p style="box-sizing: border-box; margin-bottom: 16px; margin-top: 0px; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, &quot;Segoe UI&quot;, Helvetica, Arial, sans-serif, &quot;Apple Color Emoji&quot;, &quot;Segoe UI Emoji&quot;, &quot;Segoe UI Symbol&quot;; font-size: 16px; background-color: #ffffff;">Here is a summary of some of the tricks.</p><p style="box-sizing: border-box; margin-bottom: 16px; margin-top: 0px; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, &quot;Segoe UI&quot;, Helvetica, Arial, sans-serif, &quot;Apple Color Emoji&quot;, &quot;Segoe UI Emoji&quot;, &quot;Segoe UI Symbol&quot;; font-size: 16px; background-color: #ffffff;"><a href="https://github.com/soumith/ganhacks#authors" style="box-sizing: border-box; background-color: 
transparent; color: #0366d6; text-decoration-line: none;">Here's a link to the authors of this document</a></p><p style="box-sizing: border-box; margin-bottom: 16px; margin-top: 0px; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, &quot;Segoe UI&quot;, Helvetica, Arial, sans-serif, &quot;Apple Color Emoji&quot;, &quot;Segoe UI Emoji&quot;, &quot;Segoe UI Symbol&quot;; font-size: 16px; background-color: #ffffff;">If you find a trick that is particularly useful in practice, please open a Pull Request to add it to the document. If we find it to be reasonable and verified, we will merge it in.</p><h2><a id="user-content-1-normalize-the-inputs" aria-hidden="true" href="https://github.com/soumith/ganhacks#1-normalize-the-inputs" style="box-sizing: border-box; color: #0366d6; text-decoration-line: none; float: left; line-height: 1; margin-left: -20px; padding-right: 4px;"><svg octicon-link"="" viewbox="0 0 16 16" version="1.1" width="16" height="16" aria-hidden="true"></svg></a>1. 
Normalize the inputs</h2><ul style="box-sizing: border-box; margin-bottom: 16px; margin-top: 0px; padding-left: 2em; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, &quot;Segoe UI&quot;, Helvetica, Arial, sans-serif, &quot;Apple Color Emoji&quot;, &quot;Segoe UI Emoji&quot;, &quot;Segoe UI Symbol&quot;; font-size: 16px; background-color: #ffffff;"><li style="box-sizing: border-box;">Normalize the images between -1 and 1</li><li style="box-sizing: border-box; margin-top: 0.25em;">Use Tanh as the last layer of the generator output</li></ul><h2><a id="user-content-2-a-modified-loss-function" aria-hidden="true" href="https://github.com/soumith/ganhacks#2-a-modified-loss-function" style="box-sizing: border-box; color: #0366d6; text-decoration-line: none; float: left; line-height: 1; margin-left: -20px; padding-right: 4px;"><svg octicon-link"="" viewbox="0 0 16 16" version="1.1" width="16" height="16" aria-hidden="true"></svg></a>2: A modified loss function</h2><p style="box-sizing: border-box; margin-bottom: 16px; margin-top: 0px; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, &quot;Segoe UI&quot;, Helvetica, Arial, sans-serif, &quot;Apple Color Emoji&quot;, &quot;Segoe UI Emoji&quot;, &quot;Segoe UI Symbol&quot;; font-size: 16px; background-color: #ffffff;">In GAN papers, the loss function to optimize G is&nbsp;<code style="box-sizing: border-box; font-family: SFMono-Regular, Consolas, &quot;Liberation Mono&quot;, Menlo, Courier, monospace; font-size: 13.6px; background-color: rgba(27, 31, 35, 0.05); border-radius: 3px; margin: 0px; padding: 0.2em 0.4em;">min log(1-D)</code>, but in practice folks use&nbsp;<code style="box-sizing: border-box; font-family: SFMono-Regular, Consolas, &quot;Liberation Mono&quot;, Menlo, Courier, monospace; font-size: 13.6px; background-color: rgba(27, 31, 35, 0.05); border-radius: 3px; margin: 0px; padding: 0.2em 0.4em;">max log D</code></p><ul style="box-sizing: border-box; margin-bottom: 16px; 
margin-top: 0px; padding-left: 2em; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, &quot;Segoe UI&quot;, Helvetica, Arial, sans-serif, &quot;Apple Color Emoji&quot;, &quot;Segoe UI Emoji&quot;, &quot;Segoe UI Symbol&quot;; font-size: 16px; background-color: #ffffff;"><li style="box-sizing: border-box;">because the first formulation has vanishing gradients early on</li><li style="box-sizing: border-box; margin-top: 0.25em;">Goodfellow et al. (2014)</li></ul><p style="box-sizing: border-box; margin-bottom: 16px; margin-top: 0px; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, &quot;Segoe UI&quot;, Helvetica, Arial, sans-serif, &quot;Apple Color Emoji&quot;, &quot;Segoe UI Emoji&quot;, &quot;Segoe UI Symbol&quot;; font-size: 16px; background-color: #ffffff;">In practice, this works well:</p><ul style="box-sizing: border-box; margin-bottom: 16px; margin-top: 0px; padding-left: 2em; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, &quot;Segoe UI&quot;, Helvetica, Arial, sans-serif, &quot;Apple Color Emoji&quot;, &quot;Segoe UI Emoji&quot;, &quot;Segoe UI Symbol&quot;; font-size: 16px; background-color: #ffffff;"><li style="box-sizing: border-box;">Flip labels when training generator: real = fake, fake = real</li></ul><h2><a id="user-content-3-use-a-spherical-z" aria-hidden="true" href="https://github.com/soumith/ganhacks#3-use-a-spherical-z" style="box-sizing: border-box; color: #0366d6; text-decoration-line: none; float: left; line-height: 1; margin-left: -20px; padding-right: 4px;"><svg octicon-link"="" viewbox="0 0 16 16" version="1.1" width="16" height="16" aria-hidden="true"></svg></a>3: Use a spherical Z</h2><ul style="box-sizing: border-box; margin-bottom: 16px; margin-top: 0px; padding-left: 2em; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, &quot;Segoe UI&quot;, Helvetica, Arial, sans-serif, &quot;Apple Color Emoji&quot;, &quot;Segoe UI Emoji&quot;, &quot;Segoe UI Symbol&quot;; font-size: 16px; background-color: #ffffff;"><li style="box-sizing: border-box;">Don't sample from a uniform distribution</li></ul><p style="box-sizing: border-box; margin-bottom: 16px; margin-top: 0px; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, &quot;Segoe UI&quot;, Helvetica, Arial, sans-serif, &quot;Apple Color Emoji&quot;, &quot;Segoe UI Emoji&quot;, &quot;Segoe UI Symbol&quot;; font-size: 16px; background-color: #ffffff;"><a target="_blank" rel="noopener noreferrer" href="https://github.com/soumith/ganhacks/blob/master/images/cube.png" style="box-sizing: border-box; background-color: transparent; color: #0366d6; text-decoration-line: none;"><img src="https://github.com/soumith/ganhacks/raw/master/images/cube.png" alt="cube" title="Cube" style="box-sizing: content-box; border-style: none; max-width: 100%;" /></a></p><ul style="box-sizing: border-box; margin-bottom: 16px; margin-top: 0px; padding-left: 2em; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, &quot;Segoe UI&quot;, Helvetica, Arial, sans-serif, &quot;Apple Color Emoji&quot;, &quot;Segoe UI Emoji&quot;, &quot;Segoe UI Symbol&quot;; font-size: 16px; background-color: #ffffff;"><li style="box-sizing: border-box;">Sample from a Gaussian distribution</li></ul><p style="box-sizing: border-box; margin-bottom: 16px; margin-top: 0px; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, &quot;Segoe UI&quot;, Helvetica, Arial, sans-serif, &quot;Apple Color Emoji&quot;, &quot;Segoe UI Emoji&quot;, &quot;Segoe UI Symbol&quot;; font-size: 16px; background-color: #ffffff;"><a target="_blank" rel="noopener noreferrer" href="https://github.com/soumith/ganhacks/blob/master/images/sphere.png" style="box-sizing: border-box; background-color: transparent; color: #0366d6; text-decoration-line: none;"><img src="https://github.com/soumith/ganhacks/raw/master/images/sphere.png" alt="sphere" title="Sphere" style="box-sizing: content-box; border-style: none; max-width: 100%;" /></a></p><ul 
style="box-sizing: border-box; margin-bottom: 16px; margin-top: 0px; padding-left: 2em; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, &quot;Segoe UI&quot;, Helvetica, Arial, sans-serif, &quot;Apple Color Emoji&quot;, &quot;Segoe UI Emoji&quot;, &quot;Segoe UI Symbol&quot;; font-size: 16px; background-color: #ffffff;"><li style="box-sizing: border-box;">When doing interpolations, interpolate along a great circle rather than a straight line from point A to point B</li><li style="box-sizing: border-box; margin-top: 0.25em;">Tom White's&nbsp;<a href="https://arxiv.org/abs/1609.04468" rel="nofollow" style="box-sizing: border-box; background-color: transparent; color: #0366d6; text-decoration-line: none;">Sampling Generative Networks</a>&nbsp;and its reference code&nbsp;<a href="https://github.com/dribnet/plat" style="box-sizing: border-box; background-color: transparent; color: #0366d6; text-decoration-line: none;">https://github.com/dribnet/plat</a>&nbsp;have more details</li></ul><h2><a id="user-content-4-batchnorm" aria-hidden="true" href="https://github.com/soumith/ganhacks#4-batchnorm" style="box-sizing: border-box; color: #0366d6; text-decoration-line: none; float: left; line-height: 1; margin-left: -20px; padding-right: 4px;"><svg octicon-link"="" viewbox="0 0 16 16" version="1.1" width="16" height="16" aria-hidden="true"></svg></a>4: BatchNorm</h2><ul style="box-sizing: border-box; margin-bottom: 16px; margin-top: 0px; padding-left: 2em; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, &quot;Segoe UI&quot;, Helvetica, Arial, sans-serif, &quot;Apple Color Emoji&quot;, &quot;Segoe UI Emoji&quot;, &quot;Segoe UI Symbol&quot;; font-size: 16px; background-color: #ffffff;"><li style="box-sizing: border-box;">Construct different mini-batches for real and fake, i.e. each mini-batch needs to contain either all real images or all generated images.</li><li style="box-sizing: border-box; margin-top: 0.25em;">When batchnorm is not an option, use instance normalization (for each sample, subtract the mean and divide by the standard deviation).</li></ul><p style="box-sizing: border-box; margin-bottom: 16px; margin-top: 0px; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, &quot;Segoe UI&quot;, Helvetica, Arial, sans-serif, &quot;Apple Color Emoji&quot;, &quot;Segoe UI Emoji&quot;, &quot;Segoe UI Symbol&quot;; font-size: 16px; background-color: #ffffff;"><a target="_blank" rel="noopener noreferrer" href="https://github.com/soumith/ganhacks/blob/master/images/batchmix.png" style="box-sizing: border-box; background-color: transparent; color: #0366d6; text-decoration-line: none;"><img src="https://github.com/soumith/ganhacks/raw/master/images/batchmix.png" alt="batchmix" title="BatchMix" style="box-sizing: content-box; border-style: none; max-width: 100%;" /></a></p><h2><a id="user-content-5-avoid-sparse-gradients-relu-maxpool" aria-hidden="true" href="https://github.com/soumith/ganhacks#5-avoid-sparse-gradients-relu-maxpool" style="box-sizing: border-box; color: #0366d6; text-decoration-line: none; float: left; line-height: 1; margin-left: -20px; padding-right: 4px;"><svg octicon-link"="" viewbox="0 0 16 16" version="1.1" width="16" height="16" aria-hidden="true"></svg></a>5: Avoid Sparse Gradients: ReLU, MaxPool</h2><ul style="box-sizing: border-box; margin-bottom: 16px; margin-top: 0px; padding-left: 2em; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, &quot;Segoe UI&quot;, Helvetica, Arial, sans-serif, &quot;Apple Color Emoji&quot;, &quot;Segoe UI Emoji&quot;, &quot;Segoe UI Symbol&quot;; font-size: 16px; background-color: #ffffff;"><li style="box-sizing: border-box;">The stability of the GAN game suffers if you have sparse gradients</li><li style="box-sizing: border-box; margin-top: 0.25em;">LeakyReLU = good (in 
both G and D)</li><li style="box-sizing: border-box; margin-top: 0.25em;">For Downsampling, use: Average Pooling, Conv2d + stride</li><li style="box-sizing: border-box; margin-top: 0.25em;">For Upsampling, use: PixelShuffle, ConvTranspose2d + stride<ul style="box-sizing: border-box; margin-bottom: 0px; margin-top: 0px; padding-left: 2em;"><li style="box-sizing: border-box;">PixelShuffle:&nbsp;<a href="https://arxiv.org/abs/1609.05158" rel="nofollow" style="box-sizing: border-box; background-color: transparent; color: #0366d6; text-decoration-line: none;">https://arxiv.org/abs/1609.05158</a></li></ul></li></ul><h2><a id="user-content-6-use-soft-and-noisy-labels" aria-hidden="true" href="https://github.com/soumith/ganhacks#6-use-soft-and-noisy-labels" style="box-sizing: border-box; color: #0366d6; text-decoration-line: none; float: left; line-height: 1; margin-left: -20px; padding-right: 4px;"><svg octicon-link"="" viewbox="0 0 16 16" version="1.1" width="16" height="16" aria-hidden="true"></svg></a>6: Use Soft and Noisy Labels</h2><ul style="box-sizing: border-box; margin-bottom: 16px; margin-top: 0px; padding-left: 2em; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, &quot;Segoe UI&quot;, Helvetica, Arial, sans-serif, &quot;Apple Color Emoji&quot;, &quot;Segoe UI Emoji&quot;, &quot;Segoe UI Symbol&quot;; font-size: 16px; background-color: #ffffff;"><li style="box-sizing: border-box;">Label Smoothing, i.e. if you have two target labels: Real=1 and Fake=0, then for each incoming sample, if it is real, then replace the label with a random number between 0.7 and 1.2, and if it is a fake sample, replace it with a random number between 0.0 and 0.3 (for example).<ul style="box-sizing: border-box; margin-bottom: 0px; margin-top: 0px; padding-left: 2em;"><li style="box-sizing: border-box;">Salimans et al. 2016</li></ul></li><li style="box-sizing: border-box; margin-top: 0.25em;">Make the labels noisy for the discriminator: occasionally flip the labels when training the discriminator</li></ul><h2><a id="user-content-7-dcgan--hybrid-models" aria-hidden="true" href="https://github.com/soumith/ganhacks#7-dcgan--hybrid-models" style="box-sizing: border-box; color: #0366d6; text-decoration-line: none; float: left; line-height: 1; margin-left: -20px; padding-right: 4px;"><svg octicon-link"="" viewbox="0 0 16 16" version="1.1" width="16" height="16" aria-hidden="true"></svg></a>7: DCGAN / Hybrid Models</h2><ul style="box-sizing: border-box; margin-bottom: 16px; margin-top: 0px; padding-left: 2em; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, &quot;Segoe UI&quot;, Helvetica, Arial, sans-serif, &quot;Apple Color Emoji&quot;, &quot;Segoe UI Emoji&quot;, &quot;Segoe UI Symbol&quot;; font-size: 16px; background-color: #ffffff;"><li style="box-sizing: border-box;">Use DCGAN when you can. 
It works!</li><li style="box-sizing: border-box; margin-top: 0.25em;">If you can't use DCGANs and no model is stable, use a hybrid model: KL + GAN or VAE + GAN</li></ul><h2><a id="user-content-8-use-stability-tricks-from-rl" aria-hidden="true" href="https://github.com/soumith/ganhacks#8-use-stability-tricks-from-rl" style="box-sizing: border-box; color: #0366d6; text-decoration-line: none; float: left; line-height: 1; margin-left: -20px; padding-right: 4px;"><svg octicon-link"="" viewbox="0 0 16 16" version="1.1" width="16" height="16" aria-hidden="true"></svg></a>8: Use stability tricks from RL</h2><ul style="box-sizing: border-box; margin-bottom: 16px; margin-top: 0px; padding-left: 2em; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, &quot;Segoe UI&quot;, Helvetica, Arial, sans-serif, &quot;Apple Color Emoji&quot;, &quot;Segoe UI Emoji&quot;, &quot;Segoe UI Symbol&quot;; font-size: 16px; background-color: #ffffff;"><li style="box-sizing: border-box;">Experience Replay<ul style="box-sizing: border-box; margin-bottom: 0px; margin-top: 0px; padding-left: 2em;"><li style="box-sizing: border-box;">Keep a replay buffer of past generations and occasionally show them to the discriminator</li><li style="box-sizing: border-box; margin-top: 0.25em;">Keep checkpoints from the past of G and D and occasionally swap them out for a few iterations</li></ul></li><li style="box-sizing: border-box; margin-top: 0.25em;">All stability tricks that work for deep deterministic policy gradients</li><li style="box-sizing: border-box; margin-top: 0.25em;">See Pfau &amp; Vinyals (2016)</li></ul><h2><a id="user-content-9-use-the-adam-optimizer" aria-hidden="true" href="https://github.com/soumith/ganhacks#9-use-the-adam-optimizer" style="box-sizing: border-box; color: #0366d6; text-decoration-line: none; float: left; line-height: 1; margin-left: -20px; padding-right: 4px;"><svg octicon-link"="" viewbox="0 0 16 16" version="1.1" width="16" height="16" aria-hidden="true"></svg></a>9: Use the ADAM 
Optimizer</h2><ul style="box-sizing: border-box; margin-bottom: 16px; margin-top: 0px; padding-left: 2em; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, &quot;Segoe UI&quot;, Helvetica, Arial, sans-serif, &quot;Apple Color Emoji&quot;, &quot;Segoe UI Emoji&quot;, &quot;Segoe UI Symbol&quot;; font-size: 16px; background-color: #ffffff;"><li style="box-sizing: border-box;">optim.Adam rules!<ul style="box-sizing: border-box; margin-bottom: 0px; margin-top: 0px; padding-left: 2em;"><li style="box-sizing: border-box;">See Radford et al. 2015</li></ul></li><li style="box-sizing: border-box; margin-top: 0.25em;">Use SGD for the discriminator and ADAM for the generator</li></ul><h2><a id="user-content-10-track-failures-early" aria-hidden="true" href="https://github.com/soumith/ganhacks#10-track-failures-early" style="box-sizing: border-box; color: #0366d6; text-decoration-line: none; float: left; line-height: 1; margin-left: -20px; padding-right: 4px;"><svg octicon-link"="" viewbox="0 0 16 16" version="1.1" width="16" height="16" aria-hidden="true"></svg></a>10: Track failures early</h2><ul style="box-sizing: border-box; margin-bottom: 16px; margin-top: 0px; padding-left: 2em; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, &quot;Segoe UI&quot;, Helvetica, Arial, sans-serif, &quot;Apple Color Emoji&quot;, &quot;Segoe UI Emoji&quot;, &quot;Segoe UI Symbol&quot;; font-size: 16px; background-color: #ffffff;"><li style="box-sizing: border-box;">D loss goes to 0: failure mode</li><li style="box-sizing: border-box; margin-top: 0.25em;">Check the norms of the gradients: if they are over 100, things are screwing up</li><li style="box-sizing: border-box; margin-top: 0.25em;">When things are working, D loss has low variance and goes down over time, vs. having huge variance and spiking</li><li style="box-sizing: border-box; margin-top: 0.25em;">If the loss of the generator steadily decreases, then it's fooling D with garbage (says Martin)</li></ul><h2><a 
id="user-content-11-dont-balance-loss-via-statistics-unless-you-have-a-good-reason-to" aria-hidden="true" href="https://github.com/soumith/ganhacks#11-dont-balance-loss-via-statistics-unless-you-have-a-good-reason-to" style="box-sizing: border-box; color: #0366d6; text-decoration-line: none; float: left; line-height: 1; margin-left: -20px; padding-right: 4px;"><svg octicon-link"="" viewbox="0 0 16 16" version="1.1" width="16" height="16" aria-hidden="true"></svg></a>11: Don't balance loss via statistics (unless you have a good reason to)</h2><ul style="box-sizing: border-box; margin-bottom: 16px; margin-top: 0px; padding-left: 2em; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, &quot;Segoe UI&quot;, Helvetica, Arial, sans-serif, &quot;Apple Color Emoji&quot;, &quot;Segoe UI Emoji&quot;, &quot;Segoe UI Symbol&quot;; font-size: 16px; background-color: #ffffff;"><li style="box-sizing: border-box;">Don't try to find a (number of G / number of D) schedule to uncollapse training</li><li style="box-sizing: border-box; margin-top: 0.25em;">It's hard and we've all tried it.</li><li style="box-sizing: border-box; margin-top: 0.25em;">If you do try it, have a principled approach to it, rather than intuition</li></ul><p style="box-sizing: border-box; margin-bottom: 16px; margin-top: 0px; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, &quot;Segoe UI&quot;, Helvetica, Arial, sans-serif, &quot;Apple Color Emoji&quot;, &quot;Segoe UI Emoji&quot;, &quot;Segoe UI Symbol&quot;; font-size: 16px; background-color: #ffffff;">For example:</p><pre style="box-sizing: border-box; font-family: SFMono-Regular, Consolas, &quot;Liberation Mono&quot;, Menlo, Courier, monospace; font-size: 13.6px; margin-bottom: 16px; margin-top: 0px; overflow-wrap: normal; background-color: #f6f8fa; border-radius: 3px; line-height: 1.45; overflow: auto; padding: 16px; color: #24292e;"><code style="box-sizing: border-box; font-family: SFMono-Regular, Consolas, &quot;Liberation Mono&quot;, Menlo, Courier, monospace; background: transparent; border-radius: 3px; margin: 0px; padding: 0px; border: 0px; word-break: normal; display: inline; line-height: inherit; overflow: visible; overflow-wrap: normal;">while lossD &gt; A:
  train D
while lossG &gt; B:
  train G
</code></pre><h2><a id="user-content-12-if-you-have-labels-use-them" aria-hidden="true" href="https://github.com/soumith/ganhacks#12-if-you-have-labels-use-them" style="box-sizing: border-box; color: #0366d6; text-decoration-line: none; float: left; line-height: 1; margin-left: -20px; padding-right: 4px;"><svg octicon-link"="" viewbox="0 0 16 16" version="1.1" width="16" height="16" aria-hidden="true"></svg></a>12: If you have labels, use them</h2><ul style="box-sizing: border-box; margin-bottom: 16px; margin-top: 0px; padding-left: 2em; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, &quot;Segoe UI&quot;, Helvetica, Arial, sans-serif, &quot;Apple Color Emoji&quot;, &quot;Segoe UI Emoji&quot;, &quot;Segoe UI Symbol&quot;; font-size: 16px; background-color: #ffffff;"><li style="box-sizing: border-box;">If you have labels available, train the discriminator to also classify the samples: auxiliary GANs</li></ul><h2><a id="user-content-13-add-noise-to-inputs-decay-over-time" aria-hidden="true" href="https://github.com/soumith/ganhacks#13-add-noise-to-inputs-decay-over-time" style="box-sizing: border-box; color: #0366d6; text-decoration-line: none; float: left; line-height: 1; margin-left: -20px; padding-right: 4px;"><svg octicon-link"="" viewbox="0 0 16 16" version="1.1" width="16" height="16" aria-hidden="true"></svg></a>13: Add noise to inputs, decay over time</h2><ul style="box-sizing: border-box; margin-bottom: 16px; margin-top: 0px; padding-left: 2em; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, &quot;Segoe UI&quot;, Helvetica, Arial, sans-serif, &quot;Apple Color Emoji&quot;, &quot;Segoe UI Emoji&quot;, &quot;Segoe UI Symbol&quot;; font-size: 
16px; background-color: #ffffff;"><li style="box-sizing: border-box;">Add some artificial noise to inputs to D (Arjovsky et al., Huszar, 2016)<ul style="box-sizing: border-box; margin-bottom: 0px; margin-top: 0px; padding-left: 2em;"><li style="box-sizing: border-box;"><a href="http://www.inference.vc/instance-noise-a-trick-for-stabilising-gan-training/" rel="nofollow" style="box-sizing: border-box; background-color: transparent; color: #0366d6; text-decoration-line: none;">http://www.inference.vc/instance-noise-a-trick-for-stabilising-gan-training/</a></li><li style="box-sizing: border-box; margin-top: 0.25em;"><a href="https://openreview.net/forum?id=Hk4_qw5xe" rel="nofollow" style="box-sizing: border-box; background-color: transparent; color: #0366d6; text-decoration-line: none;">https://openreview.net/forum?id=Hk4_qw5xe</a></li></ul></li><li style="box-sizing: border-box; margin-top: 0.25em;">Adding Gaussian noise to every layer of the generator (Zhao et al., EBGAN)<ul style="box-sizing: border-box; margin-bottom: 0px; margin-top: 0px; padding-left: 2em;"><li style="box-sizing: border-box;">Improved GANs: OpenAI code also has it (commented out)</li></ul></li></ul><h2><a id="user-content-14-notsure-train-discriminator-more-sometimes" aria-hidden="true" href="https://github.com/soumith/ganhacks#14-notsure-train-discriminator-more-sometimes" style="box-sizing: border-box; color: #0366d6; text-decoration-line: none; float: left; line-height: 1; margin-left: -20px; padding-right: 4px;"><svg octicon-link"="" viewbox="0 0 16 16" version="1.1" width="16" height="16" aria-hidden="true"></svg></a>14: [notsure] Train discriminator more (sometimes)</h2><ul style="box-sizing: border-box; margin-bottom: 16px; margin-top: 0px; padding-left: 2em; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, &quot;Segoe UI&quot;, Helvetica, Arial, sans-serif, &quot;Apple Color Emoji&quot;, &quot;Segoe UI Emoji&quot;, &quot;Segoe UI Symbol&quot;; font-size: 16px; 
background-color: #ffffff;"><li style="box-sizing: border-box;">especially when you have noise</li><li style="box-sizing: border-box; margin-top: 0.25em;">hard to find a schedule of number of D iterations vs G iterations</li></ul><h2><a id="user-content-15-notsure-batch-discrimination" aria-hidden="true" href="https://github.com/soumith/ganhacks#15-notsure-batch-discrimination" style="box-sizing: border-box; color: #0366d6; text-decoration-line: none; float: left; line-height: 1; margin-left: -20px; padding-right: 4px;"><svg octicon-link"="" viewbox="0 0 16 16" version="1.1" width="16" height="16" aria-hidden="true"></svg></a>15: [notsure] Batch Discrimination</h2><ul style="box-sizing: border-box; margin-bottom: 16px; margin-top: 0px; padding-left: 2em; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, &quot;Segoe UI&quot;, Helvetica, Arial, sans-serif, &quot;Apple Color Emoji&quot;, &quot;Segoe UI Emoji&quot;, &quot;Segoe UI Symbol&quot;; font-size: 16px; background-color: #ffffff;"><li style="box-sizing: border-box;">Mixed results</li></ul><h2><a id="user-content-16-discrete-variables-in-conditional-gans" aria-hidden="true" href="https://github.com/soumith/ganhacks#16-discrete-variables-in-conditional-gans" style="box-sizing: border-box; color: #0366d6; text-decoration-line: none; float: left; line-height: 1; margin-left: -20px; padding-right: 4px;"><svg octicon-link"="" viewbox="0 0 16 16" version="1.1" width="16" height="16" aria-hidden="true"></svg></a>16: Discrete variables in Conditional GANs</h2><ul style="box-sizing: border-box; margin-bottom: 16px; margin-top: 0px; padding-left: 2em; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, &quot;Segoe UI&quot;, Helvetica, Arial, sans-serif, &quot;Apple Color Emoji&quot;, &quot;Segoe UI Emoji&quot;, &quot;Segoe UI Symbol&quot;; font-size: 16px; background-color: #ffffff;"><li style="box-sizing: border-box;">Use an Embedding layer</li><li style="box-sizing: border-box; margin-top: 
0.25em;">Add as additional channels to images</li><li style="box-sizing: border-box; margin-top: 0.25em;">Keep embedding dimensionality low and upsample to match image channel size</li></ul><h2><a id="user-content-17-use-dropouts-in-g-in-both-train-and-test-phase" aria-hidden="true" href="https://github.com/soumith/ganhacks#17-use-dropouts-in-g-in-both-train-and-test-phase" style="box-sizing: border-box; color: #0366d6; text-decoration-line: none; float: left; line-height: 1; margin-left: -20px; padding-right: 4px;"><svg octicon-link"="" viewbox="0 0 16 16" version="1.1" width="16" height="16" aria-hidden="true"></svg></a>17: Use Dropouts in G in both train and test phase</h2><ul style="box-sizing: border-box; margin-bottom: 16px; margin-top: 0px; padding-left: 2em; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, &quot;Segoe UI&quot;, Helvetica, Arial, sans-serif, &quot;Apple Color Emoji&quot;, &quot;Segoe UI Emoji&quot;, &quot;Segoe UI Symbol&quot;; font-size: 16px; background-color: #ffffff;"><li style="box-sizing: border-box;">Provide noise in the form of dropout (50%).</li><li style="box-sizing: border-box; margin-top: 0.25em;">Apply on several layers of our generator at both training and test time</li><li style="box-sizing: border-box; margin-top: 0.25em;"><a href="https://arxiv.org/pdf/1611.07004v1.pdf" rel="nofollow" style="box-sizing: border-box; background-color: transparent; color: #0366d6; text-decoration-line: none;">https://arxiv.org/pdf/1611.07004v1.pdf</a></li></ul><h2><a id="user-content-authors" aria-hidden="true" href="https://github.com/soumith/ganhacks#authors" style="box-sizing: border-box; color: #0366d6; text-decoration-line: none; float: left; line-height: 1; margin-left: -20px; padding-right: 4px;"><svg octicon-link"="" viewbox="0 0 16 16" version="1.1" width="16" height="16" aria-hidden="true"></svg></a>Authors</h2><ul style="box-sizing: border-box; margin-top: 0px; padding-left: 2em; color: #24292e; font-family: 
-apple-system, BlinkMacSystemFont, &quot;Segoe UI&quot;, Helvetica, Arial, sans-serif, &quot;Apple Color Emoji&quot;, &quot;Segoe UI Emoji&quot;, &quot;Segoe UI Symbol&quot;; font-size: 16px; background-color: #ffffff; margin-bottom: 0px !important;"><li style="box-sizing: border-box;">Soumith Chintala</li><li style="box-sizing: border-box; margin-top: 0.25em;">Emily Denton</li><li style="box-sizing: border-box; margin-top: 0.25em;">Martin Arjovsky</li><li style="box-sizing: border-box; margin-top: 0.25em;">Michael Mathieu</li></ul>Reference:<br /><a href="https://github.com/soumith/ganhacks#authors">https://github.com/soumith/ganhacks#authors<br /><br /><br /><br /><h1>GAN的一些小trick<br /><br /><p style="margin: 0px 0px 1.4em; color: #1a1a1a; font-family: -apple-system, BlinkMacSystemFont, &quot;Helvetica Neue&quot;, &quot;PingFang SC&quot;, &quot;Microsoft YaHei&quot;, &quot;Source Han Sans SC&quot;, &quot;Noto Sans CJK SC&quot;, &quot;WenQuanYi Micro Hei&quot;, sans-serif; font-size: medium; font-weight: 400; background-color: #ffffff;">最近训练GAN遇到了很多坑，GAN的训练的确是个很dt的问题，如果只是用别人的paper跑一些应用还好，如果自己设计新的结构，做一些新的研究的话，就需要了解这些trick了，都是泪~</p><p style="margin: 1.4em 0px; color: #1a1a1a; font-family: -apple-system, BlinkMacSystemFont, &quot;Helvetica Neue&quot;, &quot;PingFang SC&quot;, &quot;Microsoft YaHei&quot;, &quot;Source Han Sans SC&quot;, &quot;Noto Sans CJK SC&quot;, &quot;WenQuanYi Micro Hei&quot;, sans-serif; font-size: medium; font-weight: 400; background-color: #ffffff;">这个doc&nbsp;<a href="http://link.zhihu.com/?target=https%3A//github.com/soumith/ganhacks%23authors" wrap=""  external"="" target="_blank" rel="nofollow noreferrer" data-za-detail-view-id="1043" style="text-decoration-line: none; cursor: pointer; border-bottom: 1px solid grey;">soumith/ganhacks</a><a href="http://link.zhihu.com/?target=https%3A//github.com/soumith/ganhacks%23authors" wrap=""  external"="" target="_blank" rel="nofollow noreferrer" data-za-detail-view-id="1043" 
style="text-decoration-line: none; cursor: pointer; border-bottom: 1px solid grey;">soumith/ganhacks</a>&nbsp;简直是GAN武林界的九阴真经，看完以后感觉自己上了一个level。</p><p style="margin: 1.4em 0px; color: #1a1a1a; font-family: -apple-system, BlinkMacSystemFont, &quot;Helvetica Neue&quot;, &quot;PingFang SC&quot;, &quot;Microsoft YaHei&quot;, &quot;Source Han Sans SC&quot;, &quot;Noto Sans CJK SC&quot;, &quot;WenQuanYi Micro Hei&quot;, sans-serif; font-size: medium; font-weight: 400; background-color: #ffffff;">自己做个笔记：</p><p style="margin: 1.4em 0px; color: #1a1a1a; font-family: -apple-system, BlinkMacSystemFont, &quot;Helvetica Neue&quot;, &quot;PingFang SC&quot;, &quot;Microsoft YaHei&quot;, &quot;Source Han Sans SC&quot;, &quot;Noto Sans CJK SC&quot;, &quot;WenQuanYi Micro Hei&quot;, sans-serif; font-size: medium; font-weight: 400; background-color: #ffffff;">1。normalize输入，让它在[-1,1]。generater的输出用tanh，也是[-1,1]，这就对应起来了。</p><p style="margin: 1.4em 0px; color: #1a1a1a; font-family: -apple-system, BlinkMacSystemFont, &quot;Helvetica Neue&quot;, &quot;PingFang SC&quot;, &quot;Microsoft YaHei&quot;, &quot;Source Han Sans SC&quot;, &quot;Noto Sans CJK SC&quot;, &quot;WenQuanYi Micro Hei&quot;, sans-serif; font-size: medium; font-weight: 400; background-color: #ffffff;">2。论文里面optimize G是min log(1 - D)，但在实际训练的时候可以用 max log(D)</p><p style="margin: 1.4em 0px; color: #1a1a1a; font-family: -apple-system, BlinkMacSystemFont, &quot;Helvetica Neue&quot;, &quot;PingFang SC&quot;, &quot;Microsoft YaHei&quot;, &quot;Source Han Sans SC&quot;, &quot;Noto Sans CJK SC&quot;, &quot;WenQuanYi Micro Hei&quot;, sans-serif; font-size: medium; font-weight: 400; background-color: #ffffff;">3。对于噪声z，别用均匀（uniform）分布，用高斯分布。</p><p style="margin: 1.4em 0px; color: #1a1a1a; font-family: -apple-system, BlinkMacSystemFont, &quot;Helvetica Neue&quot;, &quot;PingFang SC&quot;, &quot;Microsoft YaHei&quot;, &quot;Source Han Sans SC&quot;, &quot;Noto Sans CJK SC&quot;, &quot;WenQuanYi Micro Hei&quot;, sans-serif; font-size: 
medium; font-weight: 400; background-color: #ffffff;">4。可以用instance norm代替 batch norm。还有就是real放一起，generated放一起（感觉这个是废话QAQ）。</p><p style="margin: 1.4em 0px; color: #1a1a1a; font-family: -apple-system, BlinkMacSystemFont, &quot;Helvetica Neue&quot;, &quot;PingFang SC&quot;, &quot;Microsoft YaHei&quot;, &quot;Source Han Sans SC&quot;, &quot;Noto Sans CJK SC&quot;, &quot;WenQuanYi Micro Hei&quot;, sans-serif; font-size: medium; font-weight: 400; background-color: #ffffff;">5。避免稀疏的gradients：RELU，Maxpool那些。这一点我认为原因是不像做辨别式的网络，判别式的，尽可能提取重要的信息，其实一些对预测影响不大的信息都被忽略掉了。但是GAN不同，是生成式的模型，所以要尽可能的表现出细节方面的内容，所以避免使用稀疏的这些？</p><ul style="padding: 0px; margin: 1.4em 0px; display: table; color: #1a1a1a; font-family: -apple-system, BlinkMacSystemFont, &quot;Helvetica Neue&quot;, &quot;PingFang SC&quot;, &quot;Microsoft YaHei&quot;, &quot;Source Han Sans SC&quot;, &quot;Noto Sans CJK SC&quot;, &quot;WenQuanYi Micro Hei&quot;, sans-serif; font-size: medium; font-weight: 400; background-color: #ffffff;"><li style="list-style: none; display: table-row;">LeakyRelu</li><li style="list-style: none; display: table-row;">For Downsampling, use: Average Pooling, Conv2d + stride</li><li style="list-style: none; display: table-row;">For Upsampling, use: PixelShuffle, ConvTranspose2d + stride</li></ul><p style="margin: 1.4em 0px; color: #1a1a1a; font-family: -apple-system, BlinkMacSystemFont, &quot;Helvetica Neue&quot;, &quot;PingFang SC&quot;, &quot;Microsoft YaHei&quot;, &quot;Source Han Sans SC&quot;, &quot;Noto Sans CJK SC&quot;, &quot;WenQuanYi Micro Hei&quot;, sans-serif; font-size: medium; font-weight: 400; background-color: #ffffff;">6。可以把label为1的（real）变到0.7~1.2，label为0的变到0~0.3。这个可以深入想想。</p><p style="margin: 1.4em 0px; color: #1a1a1a; font-family: -apple-system, BlinkMacSystemFont, &quot;Helvetica Neue&quot;, &quot;PingFang SC&quot;, &quot;Microsoft YaHei&quot;, &quot;Source Han Sans SC&quot;, &quot;Noto Sans CJK SC&quot;, &quot;WenQuanYi Micro Hei&quot;, sans-serif; font-size: medium; 
font-weight: 400; background-color: #ffffff;">7。能用DCGAN就用，用不了的话用混合模型，KL+GAN，VAE+GAN之类的。</p><p style="margin: 1.4em 0px; color: #1a1a1a; font-family: -apple-system, BlinkMacSystemFont, &quot;Helvetica Neue&quot;, &quot;PingFang SC&quot;, &quot;Microsoft YaHei&quot;, &quot;Source Han Sans SC&quot;, &quot;Noto Sans CJK SC&quot;, &quot;WenQuanYi Micro Hei&quot;, sans-serif; font-size: medium; font-weight: 400; background-color: #ffffff;">8。借用RL训练技巧。</p><ul style="padding: 0px; margin: 1.4em 0px; display: table; color: #1a1a1a; font-family: -apple-system, BlinkMacSystemFont, &quot;Helvetica Neue&quot;, &quot;PingFang SC&quot;, &quot;Microsoft YaHei&quot;, &quot;Source Han Sans SC&quot;, &quot;Noto Sans CJK SC&quot;, &quot;WenQuanYi Micro Hei&quot;, sans-serif; font-size: medium; font-weight: 400; background-color: #ffffff;"><li style="list-style: none; display: table-row;">Keep a replay buffer of past generations and occasionally show them</li><li style="list-style: none; display: table-row;">Keep checkpoints from the past of G and D and occasionally swap them out for a few iterations</li></ul><p style="margin: 1.4em 0px; color: #1a1a1a; font-family: -apple-system, BlinkMacSystemFont, &quot;Helvetica Neue&quot;, &quot;PingFang SC&quot;, &quot;Microsoft YaHei&quot;, &quot;Source Han Sans SC&quot;, &quot;Noto Sans CJK SC&quot;, &quot;WenQuanYi Micro Hei&quot;, sans-serif; font-size: medium; font-weight: 400; background-color: #ffffff;">9。用ADAM！或者是D可以用SGD，G用ADAM。</p><p style="margin: 1.4em 0px; color: #1a1a1a; font-family: -apple-system, BlinkMacSystemFont, &quot;Helvetica Neue&quot;, &quot;PingFang SC&quot;, &quot;Microsoft YaHei&quot;, &quot;Source Han Sans SC&quot;, &quot;Noto Sans CJK SC&quot;, &quot;WenQuanYi Micro Hei&quot;, sans-serif; font-size: medium; font-weight: 400; background-color: #ffffff;">10。注意训练过程，尽早发现训练失败，不至于训练好长时间最后才发现，浪费时间。</p><p style="margin: 1.4em 0px; color: #1a1a1a; font-family: -apple-system, BlinkMacSystemFont, &quot;Helvetica Neue&quot;, 
&quot;PingFang SC&quot;, &quot;Microsoft YaHei&quot;, &quot;Source Han Sans SC&quot;, &quot;Noto Sans CJK SC&quot;, &quot;WenQuanYi Micro Hei&quot;, sans-serif; font-size: medium; font-weight: 400; background-color: #ffffff;">11。最好别尝试设置一些常量去balance G与D的训练过程。（他们说这个work很难做。我觉得有时间的话其实还是可以试一下的。）</p><p style="margin: 1.4em 0px; color: #1a1a1a; font-family: -apple-system, BlinkMacSystemFont, &quot;Helvetica Neue&quot;, &quot;PingFang SC&quot;, &quot;Microsoft YaHei&quot;, &quot;Source Han Sans SC&quot;, &quot;Noto Sans CJK SC&quot;, &quot;WenQuanYi Micro Hei&quot;, sans-serif; font-size: medium; font-weight: 400; background-color: #ffffff;">12。如果你对real有相应的label，用label，AC-GAN。加入label信息，可以降低生成的难度，这个应该可以想的通。</p><p style="margin: 1.4em 0px; color: #1a1a1a; font-family: -apple-system, BlinkMacSystemFont, &quot;Helvetica Neue&quot;, &quot;PingFang SC&quot;, &quot;Microsoft YaHei&quot;, &quot;Source Han Sans SC&quot;, &quot;Noto Sans CJK SC&quot;, &quot;WenQuanYi Micro Hei&quot;, sans-serif; font-size: medium; font-weight: 400; background-color: #ffffff;">13。加噪声？作用是improve生成内容得diversity?</p><ul style="padding: 0px; margin: 1.4em 0px; display: table; color: #1a1a1a; font-family: -apple-system, BlinkMacSystemFont, &quot;Helvetica Neue&quot;, &quot;PingFang SC&quot;, &quot;Microsoft YaHei&quot;, &quot;Source Han Sans SC&quot;, &quot;Noto Sans CJK SC&quot;, &quot;WenQuanYi Micro Hei&quot;, sans-serif; font-size: medium; font-weight: 400; background-color: #ffffff;"><li style="list-style: none; display: table-row;">Add some artificial noise to inputs to D (Arjovsky et. al., Huszar, 2016)</li><li style="list-style: none; display: table-row;">adding gaussian noise to every layer of generator (Zhao et. al. 
EBGAN)</li></ul><p style="margin: 1.4em 0px; color: #1a1a1a; font-family: -apple-system, BlinkMacSystemFont, &quot;Helvetica Neue&quot;, &quot;PingFang SC&quot;, &quot;Microsoft YaHei&quot;, &quot;Source Han Sans SC&quot;, &quot;Noto Sans CJK SC&quot;, &quot;WenQuanYi Micro Hei&quot;, sans-serif; font-size: medium; font-weight: 400; background-color: #ffffff;">14。【not sure】多训练D，特别是加噪声的时候。</p><p style="margin: 1.4em 0px; color: #1a1a1a; font-family: -apple-system, BlinkMacSystemFont, &quot;Helvetica Neue&quot;, &quot;PingFang SC&quot;, &quot;Microsoft YaHei&quot;, &quot;Source Han Sans SC&quot;, &quot;Noto Sans CJK SC&quot;, &quot;WenQuanYi Micro Hei&quot;, sans-serif; font-size: medium; font-weight: 400; background-color: #ffffff;">15。【not sure】batch D，感觉貌似是和pix2pix中的patchGAN有点像？</p><p style="margin: 1.4em 0px; color: #1a1a1a; font-family: -apple-system, BlinkMacSystemFont, &quot;Helvetica Neue&quot;, &quot;PingFang SC&quot;, &quot;Microsoft YaHei&quot;, &quot;Source Han Sans SC&quot;, &quot;Noto Sans CJK SC&quot;, &quot;WenQuanYi Micro Hei&quot;, sans-serif; font-size: medium; font-weight: 400; background-color: #ffffff;">16。CGAN，我一直觉得CGAN这种才符合人类学习的思路。原始的GAN就太粗暴了，就好像什么都不知道，然后两个人D与G讨论交流对抗，产生的都是一些前人没有做过的工作，开篇的工作，所以比较困难一些，但是CGAN的话就有了一定的前提，也就是技术积累，所以比较简单一些。有点类似科研中的大牛挖坑，开辟新方向（GAN）。小牛填坑（CGAN）。</p><p style="margin: 1.4em 0px; color: #1a1a1a; font-family: -apple-system, BlinkMacSystemFont, &quot;Helvetica Neue&quot;, &quot;PingFang SC&quot;, &quot;Microsoft YaHei&quot;, &quot;Source Han Sans SC&quot;, &quot;Noto Sans CJK SC&quot;, &quot;WenQuanYi Micro Hei&quot;, sans-serif; font-size: medium; font-weight: 400; background-color: #ffffff;">17。在G中的几层中用dropout（50%）。这个有一篇论文，还没看。</p><br style="color: #1a1a1a; font-family: -apple-system, BlinkMacSystemFont, &quot;Helvetica Neue&quot;, &quot;PingFang SC&quot;, &quot;Microsoft YaHei&quot;, &quot;Source Han Sans SC&quot;, &quot;Noto Sans CJK SC&quot;, &quot;WenQuanYi Micro Hei&quot;, sans-serif; font-size: medium; font-weight: 
400; background-color: #ffffff;" /><p style="margin: 1.4em 0px; color: #1a1a1a; font-family: -apple-system, BlinkMacSystemFont, &quot;Helvetica Neue&quot;, &quot;PingFang SC&quot;, &quot;Microsoft YaHei&quot;, &quot;Source Han Sans SC&quot;, &quot;Noto Sans CJK SC&quot;, &quot;WenQuanYi Micro Hei&quot;, sans-serif; font-size: medium; font-weight: 400; background-color: #ffffff;">读完这些感觉自己想要设计GAN的话，应该有个系统的认识了，不会觉得自己好像有哪些重要的地方还不知道，很不踏实感觉。这种感觉对我这种强迫症的感觉很不爽啊！！看完以后顿时舒服了很多~~~</p></h1><a href="https://zhuanlan.zhihu.com/p/27725664">https://zhuanlan.zhihu.com/p/27725664</a></a><img src ="http://www.cppblog.com/guijie/aggbug/216325.html" width = "1" height = "1" /><br><br><div align=right><a style="text-decoration:none;" href="http://www.cppblog.com/guijie/" target="_blank">杰哥</a> 2019-04-02 05:42 <a href="http://www.cppblog.com/guijie/archive/2019/04/02/216325.html#Feedback" target="_blank" style="text-decoration:none;">发表评论</a></div>]]></description></item><item><title>logits</title><link>http://www.cppblog.com/guijie/archive/2019/03/27/216316.html</link><dc:creator>杰哥</dc:creator><author>杰哥</author><pubDate>Tue, 26 Mar 2019 21:01:00 GMT</pubDate><guid>http://www.cppblog.com/guijie/archive/2019/03/27/216316.html</guid><wfw:comment>http://www.cppblog.com/guijie/comments/216316.html</wfw:comment><comments>http://www.cppblog.com/guijie/archive/2019/03/27/216316.html#Feedback</comments><slash:comments>0</slash:comments><wfw:commentRss>http://www.cppblog.com/guijie/comments/commentRss/216316.html</wfw:commentRss><trackback:ping>http://www.cppblog.com/guijie/services/trackbacks/216316.html</trackback:ping><description><![CDATA[<div>What is the meaning of the word logits in TensorFlow?</div><div>In the following TensorFlow function, we must feed the activation of artificial neurons in the final layer. That I understand. But I don't understand why it is called logits? 
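(An aside, not part of the original question: a minimal NumPy sketch of what "logits" are operationally — the raw, unnormalized last-layer scores, which the loss function normalizes internally. The numbers below are made up.)

```python
import numpy as np

def softmax_cross_entropy_with_logits(logits, labels):
    """Cross-entropy computed directly from logits (raw pre-softmax scores).

    Doing log-softmax in one step, with the max subtracted first,
    avoids the overflow risked by exponentiating the logits yourself.
    """
    shifted = logits - logits.max()
    log_softmax = shifted - np.log(np.exp(shifted).sum())
    return -(labels * log_softmax).sum()

logits = np.array([2.0, 1.0, 0.1])   # made-up last-layer activations
labels = np.array([1.0, 0.0, 0.0])   # one-hot target

loss = softmax_cross_entropy_with_logits(logits, labels)
# equal to -log(softmax(logits)[true_class]), just computed more safely
```

For a binary discriminator the same idea applies with sigmoid in place of softmax: the last layer emits one logit, and sigmoid(logit) is the probability of "real".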
Isn't that a mathematical function?</div><div></div><div>loss_function = tf.nn.softmax_cross_entropy_with_logits(</div><div>&nbsp; &nbsp; &nbsp;logits = last_layer,</div><div>&nbsp; &nbsp; &nbsp;labels = target_output</div><div>)<br /><br />For example, in the last layer of the discriminator of generative adversarial networks (GAN), we will use sigmoid(logits) to get the output of D. This is discussed&nbsp;with Zhengxia.</div><div>Reference:</div><div>https://stackoverflow.com/questions/41455101/what-is-the-meaning-of-the-word-logits-in-tensorflow</div><img src ="http://www.cppblog.com/guijie/aggbug/216316.html" width = "1" height = "1" /><br><br><div align=right><a style="text-decoration:none;" href="http://www.cppblog.com/guijie/" target="_blank">杰哥</a> 2019-03-27 05:01 <a href="http://www.cppblog.com/guijie/archive/2019/03/27/216316.html#Feedback" target="_blank" style="text-decoration:none;">发表评论</a></div>]]></description></item><item><title>Github</title><link>http://www.cppblog.com/guijie/archive/2019/01/15/216200.html</link><dc:creator>杰哥</dc:creator><author>杰哥</author><pubDate>Mon, 14 Jan 2019 20:16:00 GMT</pubDate><guid>http://www.cppblog.com/guijie/archive/2019/01/15/216200.html</guid><wfw:comment>http://www.cppblog.com/guijie/comments/216200.html</wfw:comment><comments>http://www.cppblog.com/guijie/archive/2019/01/15/216200.html#Feedback</comments><slash:comments>0</slash:comments><wfw:commentRss>http://www.cppblog.com/guijie/comments/commentRss/216200.html</wfw:commentRss><trackback:ping>http://www.cppblog.com/guijie/services/trackbacks/216200.html</trackback:ping><description><![CDATA[<span style="color: #3c4043; font-family: Roboto, &quot;Noto Sans SC&quot;, Helvetica, Arial, sans-serif; background-color: #f1f3f4;"></span><div>例如"PyTorch_tutorial_0.0.5"的pytorch的第八页Code/1_data_prepare/1_1_cifar10_to_png.py,为什么Github上"链接另存为"与Clone or download出来的有区别?链接另存为虽然保存的也是.py文件，再在此文件后加上.html后缀，打开就是该网页，保存的实际是网页</div><div><br /><div>谷歌搜索: Github difference clone 
download<br /><span style="color: #222222; font-family: arial, sans-serif; font-size: 16px; background-color: #ffffff;">When you&nbsp;</span><strong style="color: #222222; font-family: arial, sans-serif; font-size: 16px; background-color: #ffffff;">clone</strong><span style="color: #222222; font-family: arial, sans-serif; font-size: 16px; background-color: #ffffff;">&nbsp;you get a copy of the history and it is a functional&nbsp;</span><strong style="color: #222222; font-family: arial, sans-serif; font-size: 16px; background-color: #ffffff;">git</strong><span style="color: #222222; font-family: arial, sans-serif; font-size: 16px; background-color: #ffffff;">&nbsp;repo.</span><strong style="color: #222222; font-family: arial, sans-serif; font-size: 16px; background-color: #ffffff;"> Downloading</strong><span style="color: #222222; font-family: arial, sans-serif; font-size: 16px; background-color: #ffffff;">&nbsp;a repository just&nbsp;</span><strong style="color: #222222; font-family: arial, sans-serif; font-size: 16px; background-color: #ffffff;">downloads</strong><span style="color: #222222; font-family: arial, sans-serif; font-size: 16px; background-color: #ffffff;">&nbsp;the files from the most recent commit of the default branch. It doesn't&nbsp;</span><strong style="color: #222222; font-family: arial, sans-serif; font-size: 16px; background-color: #ffffff;">download</strong><span style="color: #222222; font-family: arial, sans-serif; font-size: 16px; background-color: #ffffff;">&nbsp;any of the files in the .</span><strong style="color: #222222; font-family: arial, sans-serif; font-size: 16px; background-color: #ffffff;">git</strong><span style="color: #222222; font-family: arial, sans-serif; font-size: 16px; background-color: #ffffff;">&nbsp;folder. ... 
It's as if</span><strong style="color: #222222; font-family: arial, sans-serif; font-size: 16px; background-color: #ffffff;">git</strong><span style="color: #222222; font-family: arial, sans-serif; font-size: 16px; background-color: #ffffff;">&nbsp;never existed, and all you have is a copy of the code/files. <br />Zhengxia&nbsp;said, if you&nbsp;clone&nbsp;a project, you&nbsp;can have v1, v2 and the&nbsp;most recent&nbsp;version. If you&nbsp;download&nbsp;a project, you&nbsp;can only have the&nbsp;</span><span style="color: #222222; font-family: arial, sans-serif; font-size: 16px; background-color: #ffffff;">most recent&nbsp;version.&nbsp;<br /><br /></span></div></div><div>怎么clone,没什么特别讲究吧,还有网址? 就是点击Clone or download, Download ZIP,这样就是Download; 选择"Open in Desktop"， 你有没有安装GitHub Desktop？Yuxiang说正常就用Download就行了；没有安装GitHub Desktop，他用的Git bash，利用&#8220;Clone or download&#8221;的网址，就能clone，这个需要的时候再问。<br /></div><div>This is with Yuxiang and Zhengxia's help.<br /><br /><div>20200617 read twice https://guides.github.com/activities/hello-world/<br /><br /><span style="color: #444444; font-family: &quot;Helvetica Neue&quot;, Helvetica, Arial, sans-serif; font-size: 18px; font-variant-numeric: normal; font-variant-east-asian: normal; line-height: 20.4px; widows: 1; background-color: #ffffff;">learn GitHub&#8217;s Pull Request workflow, a popular way to create and review code.<br /><br /><h2>Step 2. 
Create a Branch</h2><p style="box-sizing: border-box; margin: 15px 0px; font-variant-numeric: normal; font-variant-east-asian: normal; line-height: 20.4px;"><strong style="box-sizing: border-box;">Branching</strong>&nbsp;is the way to work on different versions of a repository at one time.</p></span></div><span style="color: #444444; font-family: &quot;Helvetica Neue&quot;, Helvetica, Arial, sans-serif; font-size: 18px; font-variant-numeric: normal; font-variant-east-asian: normal; line-height: 30.6px; widows: 1; background-color: #ffffff;"><span style="font-variant-numeric: normal; font-variant-east-asian: normal; line-height: 30.6px;">On GitHub, saved changes are called&nbsp;</span><em style="box-sizing: border-box; font-variant-numeric: normal; font-variant-east-asian: normal; line-height: 30.6px;">commits</em><span style="font-variant-numeric: normal; font-variant-east-asian: normal; line-height: 30.6px;">. Each commit has an associated&nbsp;</span><em style="box-sizing: border-box; font-variant-numeric: normal; font-variant-east-asian: normal; line-height: 30.6px;">commit message</em><span style="font-variant-numeric: normal; font-variant-east-asian: normal; line-height: 30.6px;">, which is a description explaining why a particular change was made. 
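The branch / commit / merge workflow described here can be tried end to end in a throwaway local repository. This is only a sketch (the branch name readme-edits, the identity, and the file contents are made up; no GitHub account or network access is needed):

```shell
# Demonstrate branch -> commit -> merge in a scratch repository.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com   # placeholder identity for the demo
git config user.name demo
git commit -q --allow-empty -m "initial commit"
base=$(git symbolic-ref --short HEAD)    # default branch (main or master)

git checkout -q -b readme-edits          # a branch: another version of the repo
echo "hello world" > README.md
git add README.md
git commit -q -m "Add README"            # a commit: saved change + message

git checkout -q "$base"
git merge -q readme-edits                # locally, this is what merging a PR does
cat README.md                            # prints: hello world
```

A pull request on GitHub adds review and discussion on top of this same merge step.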
Commit messages capture the history of your changes, so other contributors can understand what you&#8217;ve done and why.<br /><span style="font-variant-numeric: normal; font-variant-east-asian: normal; line-height: 20.4px;">By using GitHub&#8217;s&nbsp;</span><a href="https://help.github.com/articles/about-writing-and-formatting-on-github/#text-formatting-toolbar" style="box-sizing: border-box; color: #4183c4; text-decoration-line: none; font-variant-numeric: normal; font-variant-east-asian: normal; line-height: 20.4px;">@mention system</a><span style="font-variant-numeric: normal; font-variant-east-asian: normal; line-height: 20.4px;">&nbsp;in your pull request message, you can ask for feedback from specific people or teams, whether they&#8217;re down the hall or 10 time zones away.</span><br /></span><br /></span></div><span style="background-color: #f1f3f4; color: #3c4043; font-family: Roboto, &quot;Noto Sans SC&quot;, Helvetica, Arial, sans-serif;"></span><img src ="http://www.cppblog.com/guijie/aggbug/216200.html" width = "1" height = "1" /><br><br><div align=right><a style="text-decoration:none;" href="http://www.cppblog.com/guijie/" target="_blank">杰哥</a> 2019-01-15 04:16 <a href="http://www.cppblog.com/guijie/archive/2019/01/15/216200.html#Feedback" target="_blank" style="text-decoration:none;">发表评论</a></div>]]></description></item><item><title>[zz]正定、超定、欠定矩阵</title><link>http://www.cppblog.com/guijie/archive/2019/01/12/216191.html</link><dc:creator>杰哥</dc:creator><author>杰哥</author><pubDate>Fri, 11 Jan 2019 18:43:00 
GMT</pubDate><guid>http://www.cppblog.com/guijie/archive/2019/01/12/216191.html</guid><wfw:comment>http://www.cppblog.com/guijie/comments/216191.html</wfw:comment><comments>http://www.cppblog.com/guijie/archive/2019/01/12/216191.html#Feedback</comments><slash:comments>0</slash:comments><wfw:commentRss>http://www.cppblog.com/guijie/comments/commentRss/216191.html</wfw:commentRss><trackback:ping>http://www.cppblog.com/guijie/services/trackbacks/216191.html</trackback:ping><description><![CDATA[<div>正定、超定、欠定矩阵</div><div>正定</div><div>定义</div><div>广义定义</div><div>设M是n阶方阵，如果对任何非零向量z，都有 z&#8242;Mz&gt;0，其中z&#8242;表示z的转置，就称M为正定矩阵。[1]&nbsp;</div><div>例如：B为n阶矩阵，E为单位矩阵，a为正实数。aE+B在a充分大时为正定矩阵。（B必须为对称阵）。</div><div></div><div>狭义定义</div><div>一个n阶的实对称矩阵M是正定的当且仅当对于所有的非零实系数向量z，都有z&#8242;Mz&gt;0。其中z&#8242;表示z的转置。</div><div></div><div>性质</div><div>正定矩阵在合同变换下可化为标准型， 即单位矩阵。</div><div></div><div>合同矩阵：两个实对称矩阵A和B，如存在可逆矩阵P，使得A=P<sup>T</sup>BP，就称矩阵A和B互为合同矩阵，并且称由A到B的变换叫合同变换。</div><div></div><div>所有特征值大于零的对称矩阵（或厄米矩阵）是正定矩阵。</div><div></div><div>判定定理1：对称阵A为正定的充分必要条件是：A的特征值全为正。&nbsp;</div><div>判定定理2：对称阵A为正定的充分必要条件是：A的各阶顺序主子式都为正。&nbsp;</div><div>判定定理3：任意阵A为正定的充分必要条件是：A合同于单位阵。</div><div></div><div>1.正定矩阵一定是非奇异的。非奇异矩阵的定义：若n阶矩阵A的行列式不为零，即|A|&#8800;0。&nbsp;</div><div>2.正定矩阵的任一主子矩阵也是正定矩阵。&nbsp;</div><div>3.若A为n阶对称正定矩阵，则存在唯一的主对角线元素都是正数的下三角阵L，使得A=L&#8727;L&#8242;，此分解式称为 正定矩阵的乔列斯基（Cholesky）分解。&nbsp;</div><div>4.若A为n阶正定矩阵，则A为n阶可逆矩阵。</div><div></div><div>矩阵的每一行代表一个方程，m行代表m个线性联立方程。 n列代表n个变量。如果m是独立方程数，根据m</div><div></div><div>超定方程组</div><div>方程个数大于未知量个数的方程组。</div><div></div><div>对于方程组Ra=y，R为n&#215;m矩阵，如果R列满秩，且n&gt;m。</div><div></div><div>超定方程一般是不存在解的矛盾方程。</div><div></div><div>例如，如果给定的三点不在一条直线上，我们将无法得到这样一条直线，使得这条直线同时经过给定这三个点。 也就是说给定的条件（限制）过于严格， 
导致解不存在。在实验数据处理和曲线拟合问题中，求解超定方程组非常普遍。比较常用的方法是最小二乘法。形象地说，就是在无法完全满足给定的这些条件的情况下，求一个最接近的解。</div><div></div><div>曲线拟合的最小二乘法要解决的问题，实际上就是求以上超定方程组的最小二乘解的问题。</div><div></div><div>欠定方程组</div><div>方程个数小于未知量个数的方程组。</div><div></div><div>对于方程组Ra=y，R为n&#215;m矩阵，且n&lt;m。则方程组有无穷多组解，此时称方程组为欠定方程组。</div><div></div><div>内点法和梯度投影法是目前解欠定方程组的常用方法。</div><div>---------------------&nbsp;</div><div>原文：https://blog.csdn.net/hfdwdjl/article/details/44133845&nbsp;<br /><br />评论:欠定方程，<span style="font-family: 楷体;">这种定义不太严格，因为</span><span style="font-family: &quot;Times New Roman&quot;, serif;">n&lt;m</span><span style="font-family: 楷体;">未必有无穷多解，极端例子</span><span style="font-family: &quot;Times New Roman&quot;, serif;">,x+y+z=1</span><span style="font-family: 楷体;">和</span><span style="font-family: &quot;Times New Roman&quot;, serif;">x+y+z=3</span><span style="font-family: 楷体;">，两个方程，三个未知数，无解。</span></div>  <span style="font-size: 10.5pt; font-family: 楷体;">还是应该要用到the notes of linear algebra&nbsp;</span><span style="font-size:10.5pt;font-family:&quot;Times New Roman&quot;,serif;">P41反页。下面维基上这个应该是严格的，有无穷多解的就叫欠定。<br /></span>https://zh.wikipedia.org/wiki/%E7%BA%BF%E6%80%A7%E6%96%B9%E7%A8%8B%E7%BB%84<img src ="http://www.cppblog.com/guijie/aggbug/216191.html" width = "1" height = "1" /><br><br><div align=right><a style="text-decoration:none;" href="http://www.cppblog.com/guijie/" target="_blank">杰哥</a> 2019-01-12 02:43 <a href="http://www.cppblog.com/guijie/archive/2019/01/12/216191.html#Feedback" target="_blank" style="text-decoration:none;">发表评论</a></div>]]></description></item><item><title>Stepwise Feature Selection</title><link>http://www.cppblog.com/guijie/archive/2018/12/06/216107.html</link><dc:creator>杰哥</dc:creator><author>杰哥</author><pubDate>Wed, 05 Dec 2018 22:32:00 
GMT</pubDate><guid>http://www.cppblog.com/guijie/archive/2018/12/06/216107.html</guid><wfw:comment>http://www.cppblog.com/guijie/comments/216107.html</wfw:comment><comments>http://www.cppblog.com/guijie/archive/2018/12/06/216107.html#Feedback</comments><slash:comments>0</slash:comments><wfw:commentRss>http://www.cppblog.com/guijie/comments/commentRss/216107.html</wfw:commentRss><trackback:ping>http://www.cppblog.com/guijie/services/trackbacks/216107.html</trackback:ping><description><![CDATA[<span style="font-family: &quot;Microsoft YaHei&quot;; font-size: 32px; font-weight: 700;">Feature Selection<br /><br /></span><span style="font-family: &quot;Microsoft YaHei&quot;; font-size: medium;">However, there are some heuristic approaches that are often useful. We will look at the following approaches:<br /></span><ul style="font-family: &quot;Microsoft YaHei&quot;; font-size: medium;"><li><a href="https://www.cs.princeton.edu/courses/archive/fall08/cos436/Duda/FS/exhaust.htm">Exhaustive methods</a><br /><br /></li><li><a href="https://www.cs.princeton.edu/courses/archive/fall08/cos436/Duda/FS/stepwise.htm">Stepwise selection</a><br /><br /></li><li><a href="https://www.cs.princeton.edu/courses/archive/fall08/cos436/Duda/FS/combine.htm">Feature combination</a></li></ul><h1 style="font-family: &quot;Microsoft YaHei&quot;;"><center>Stepwise Selection</center></h1><span style="font-family: &quot;Microsoft YaHei&quot;; font-size: medium;">A common suggestion for avoiding the consideration of all subsets is to use&nbsp;</span><strong style="font-family: &quot;Microsoft YaHei&quot;; font-size: medium;">stepwise selection</strong><span style="font-family: &quot;Microsoft YaHei&quot;; font-size: medium;">. There are two standard approaches:</span><ul style="font-family: &quot;Microsoft YaHei&quot;; font-size: medium;"><li><strong>Forward selection</strong>. Begin by finding the best single feature, and commit to it. 
In general, given a set of selected features, add the feature that improves performance most.<br /><br /></li><li><strong>Backward elimination</strong>. From a set of remaining features, repeatedly delete the feature that reduces performance the least.<br /><br /><br />Reference:<br /><div>https://www.cs.princeton.edu/courses/archive/fall08/cos436/Duda/FS/FS_home.htm<br /><div>https://www.cs.princeton.edu/courses/archive/fall08/cos436/Duda/FS/stepwise.htm</div></div></li></ul><img src ="http://www.cppblog.com/guijie/aggbug/216107.html" width = "1" height = "1" /><br><br><div align=right><a style="text-decoration:none;" href="http://www.cppblog.com/guijie/" target="_blank">杰哥</a> 2018-12-06 06:32 <a href="http://www.cppblog.com/guijie/archive/2018/12/06/216107.html#Feedback" target="_blank" style="text-decoration:none;">发表评论</a></div>]]></description></item><item><title>Affine Functions (仿射函数)</title><link>http://www.cppblog.com/guijie/archive/2018/10/26/216022.html</link><dc:creator>杰哥</dc:creator><author>杰哥</author><pubDate>Thu, 25 Oct 2018 19:09:00 GMT</pubDate><guid>http://www.cppblog.com/guijie/archive/2018/10/26/216022.html</guid><wfw:comment>http://www.cppblog.com/guijie/comments/216022.html</wfw:comment><comments>http://www.cppblog.com/guijie/archive/2018/10/26/216022.html#Feedback</comments><slash:comments>0</slash:comments><wfw:commentRss>http://www.cppblog.com/guijie/comments/commentRss/216022.html</wfw:commentRss><trackback:ping>http://www.cppblog.com/guijie/services/trackbacks/216022.html</trackback:ping><description><![CDATA[<div><strong>Affine Functions</strong></div><div><strong>Affine Functions in 1D:</strong></div><div>An affine function is a function composed of a linear function + a constant and its graph is a straight line. 
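As a quick numerical sketch (not part of the original article; the map and the coefficients below are chosen arbitrarily for illustration), the characteristic properties of an affine map — midpoints of segments are preserved, and parallel lines stay parallel — can be checked directly in Python:

```python
# A 2D affine map f(p) = M p + t with arbitrarily chosen coefficients:
# M = [[2, 1], [0.5, 3]], t = (4, -1).
def f(p):
    x, y = p
    return (2 * x + 1 * y + 4, 0.5 * x + 3 * y - 1)

def midpoint(p, q):
    return ((p[0] + q[0]) / 2, (p[1] + q[1]) / 2)

# Midpoint preservation: the image of a midpoint is the midpoint of the images,
# so three collinear points stay collinear with the middle one in the middle.
a, c = (0.0, 0.0), (2.0, 2.0)
assert f(midpoint(a, c)) == midpoint(f(a), f(c))

# Parallelism: a direction vector d maps to M d regardless of the base point
# (the translation t cancels), so parallel lines remain parallel.
def direction(base, d):
    p1 = f((base[0] + d[0], base[1] + d[1]))
    p0 = f(base)
    return (p1[0] - p0[0], p1[1] - p0[1])

assert direction((0.0, 0.0), (1.0, 1.0)) == direction((5.0, -3.0), (1.0, 1.0))
print("midpoint and parallelism preserved")
```

A purely linear map (t = 0) has the same properties; the translation part only shifts the picture without affecting them.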
The general equation for an affine function in 1D is: <strong>y = Ax + c</strong>.&nbsp;</div><div></div><div>An affine function represents an affine transformation, which is equivalent to a linear transformation followed by a translation. An affine transformation preserves certain attributes of the graph. These include:&nbsp;</div><div></div><div>&#8226;<span style="white-space:pre">	</span>If three points belong to the same line, then under an affine transformation those three points still belong to the same line, and the middle point remains in the middle.</div><div>&#8226;<span style="white-space:pre">	</span>Parallel lines remain parallel.</div><div>&#8226;<span style="white-space:pre">	</span>Concurrent lines remain concurrent.</div><div>&#8226;<span style="white-space:pre">	</span>The ratio of lengths of line segments on a given line remains constant.</div><div>&#8226;<span style="white-space:pre">	</span>The ratio of areas of two triangles remains constant.</div><div>&#8226;<span style="white-space:pre">	</span>Ellipses remain ellipses, and the same is true for parabolas and hyperbolas.<br /></div><div><strong>Affine Functions in 2D:</strong></div><div>In 2D the equation of an affine function is <strong>f(x,y) = Ax + By + C</strong>.&nbsp;</div><div></div><div>The graph of a wave in 2D, shown in the next section, is an example of the graph of a 2D affine function.&nbsp;<br /></div><div></div><div></div><div><strong>Affine Functions in 3D:</strong></div><div>In 3D the equation of an affine function is <strong>f(x,y,z) = Ax + By + Cz + D</strong>.&nbsp;<br /><br />Reference:<br /><div>http://www.math.ubc.ca/~cass/courses/m309-03a/a1/olafson/affine_fuctions.htm</div></div><div></div><img src ="http://www.cppblog.com/guijie/aggbug/216022.html" width = "1" height = "1" /><br><br><div align=right><a style="text-decoration:none;" href="http://www.cppblog.com/guijie/" target="_blank">杰哥</a> 2018-10-26 03:09 <a 
href="http://www.cppblog.com/guijie/archive/2018/10/26/216022.html#Feedback" target="_blank" style="text-decoration:none;">发表评论</a></div>]]></description></item><item><title>[zz] Adversarial Nets Papers</title><link>http://www.cppblog.com/guijie/archive/2018/09/28/215982.html</link><dc:creator>杰哥</dc:creator><author>杰哥</author><pubDate>Fri, 28 Sep 2018 14:02:00 GMT</pubDate><guid>http://www.cppblog.com/guijie/archive/2018/09/28/215982.html</guid><wfw:comment>http://www.cppblog.com/guijie/comments/215982.html</wfw:comment><comments>http://www.cppblog.com/guijie/archive/2018/09/28/215982.html#Feedback</comments><slash:comments>0</slash:comments><wfw:commentRss>http://www.cppblog.com/guijie/comments/commentRss/215982.html</wfw:commentRss><trackback:ping>http://www.cppblog.com/guijie/services/trackbacks/215982.html</trackback:ping><description><![CDATA[&nbsp;&nbsp;&nbsp;&nbsp; 摘要: The classic about Generative Adversarial NetworksFirst paper&nbsp;[Generative Adversarial Nets]&nbsp;[Paper]&nbsp;[Code](the First paper of GAN)Unclassified&nbsp;[Deep Generative Image Models using a ...&nbsp;&nbsp;<a href='http://www.cppblog.com/guijie/archive/2018/09/28/215982.html'>阅读全文</a><img src ="http://www.cppblog.com/guijie/aggbug/215982.html" width = "1" height = "1" /><br><br><div align=right><a style="text-decoration:none;" href="http://www.cppblog.com/guijie/" target="_blank">杰哥</a> 2018-09-28 22:02 <a href="http://www.cppblog.com/guijie/archive/2018/09/28/215982.html#Feedback" target="_blank" style="text-decoration:none;">发表评论</a></div>]]></description></item><item><title>[zz] 2018 CVPR GAN 相关论文调研</title><link>http://www.cppblog.com/guijie/archive/2018/09/28/215981.html</link><dc:creator>杰哥</dc:creator><author>杰哥</author><pubDate>Fri, 28 Sep 2018 14:00:00 
GMT</pubDate><guid>http://www.cppblog.com/guijie/archive/2018/09/28/215981.html</guid><wfw:comment>http://www.cppblog.com/guijie/comments/215981.html</wfw:comment><comments>http://www.cppblog.com/guijie/archive/2018/09/28/215981.html#Feedback</comments><slash:comments>0</slash:comments><wfw:commentRss>http://www.cppblog.com/guijie/comments/commentRss/215981.html</wfw:commentRss><trackback:ping>http://www.cppblog.com/guijie/services/trackbacks/215981.html</trackback:ping><description><![CDATA[<h3><span style="color:#FF0000;">风格迁移</span></h3>  <p style="margin-left:0cm;"><strong>1. </strong><strong>PairedCycleGAN: Asymmetric Style Transfer for Applying and Removing Makeup</strong></p>  <p style="margin-left:0cm;">（给人脸化妆的风格转移）</p>  <p style="margin-left:0cm;"><span style="color:#0563c1;"><u><a href="http://openaccess.thecvf.com/content_cvpr_2018/papers/Chang_PairedCycleGAN_Asymmetric_Style_CVPR_2018_paper.pdf" rel="nofollow" target="_blank">http://openaccess.thecvf.com/content_cvpr_2018/papers/Chang_PairedCycleGAN_Asymmetric_Style_CVPR_2018_paper.pdf</a></u></span></p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>2.CartoonGAN: Generative Adversarial Networks for Photo Cartoonization</strong></p>  <p style="margin-left:0cm;">（将图片转化为卡通风格的GAN）</p>  <p style="margin-left:0cm;"><span style="color:#0563c1;"><u><a href="http://openaccess.thecvf.com/content_cvpr_2018/papers/Chen_CartoonGAN_Generative_Adversarial_CVPR_2018_paper.pdf" rel="nofollow" target="_blank">http://openaccess.thecvf.com/content_cvpr_2018/papers/Chen_CartoonGAN_Generative_Adversarial_CVPR_2018_paper.pdf</a></u></span></p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>3.StarGAN: Unified Generative Adversarial Networks for Multi-Domain Image-to-Image Translation</strong></p>  <p style="margin-left:0cm;">（人脸多种风格转换）</p>  <p style="margin-left:0cm;"><span style="color:#0563c1;"><u><a 
href="http://openaccess.thecvf.com/content_cvpr_2018/papers/Choi_StarGAN_Unified_Generative_CVPR_2018_paper.pdf" rel="nofollow" target="_blank">http://openaccess.thecvf.com/content_cvpr_2018/papers/Choi_StarGAN_Unified_Generative_CVPR_2018_paper.pdf</a></u></span></p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>4.Multi-Content GAN for Few-Shot Font Style Transfer</strong></p>  <p style="margin-left:0cm;">（字体风格转换）</p>  <p style="margin-left:0cm;"><span style="color:#0563c1;"><u><a href="http://openaccess.thecvf.com/content_cvpr_2018/papers/Azadi_Multi-Content_GAN_for_CVPR_2018_paper.pdf" rel="nofollow" target="_blank">http://openaccess.thecvf.com/content_cvpr_2018/papers/Azadi_Multi-Content_GAN_for_CVPR_2018_paper.pdf</a></u></span></p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>5.DA-GAN: Instance-level Image Translation by Deep Attention Generative Adversarial Networks</strong></p>  <p style="margin-left:0cm;">（图到图转换）</p>  <p style="margin-left:0cm;"><span style="color:#0563c1;"><u><a href="http://openaccess.thecvf.com/content_cvpr_2018/papers/Ma_DA-GAN_Instance-Level_Image_CVPR_2018_paper.pdf" rel="nofollow" target="_blank">http://openaccess.thecvf.com/content_cvpr_2018/papers/Ma_DA-GAN_Instance-Level_Image_CVPR_2018_paper.pdf</a></u></span></p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>6. </strong><strong>Conditional Image-to-Image translation</strong></p>  <p style="margin-left:0cm;">（图到图的转换）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Lin_Conditional_Image-to-Image_Translation_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <h3><a name="t2"></a><span style="color:#FF0000;">图片处理</span></h3>  <p style="margin-left:0cm;"><strong>1. 
</strong><strong>DeblurGAN: Blind Motion Deblurring Using Conditional Adversarial Networks</strong></p>  <p style="margin-left:0cm;">（去模糊）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Kupyn_DeblurGAN_Blind_Motion_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>2.Attentive Generative Adversarial Network for Raindrop Removal from A Single Image</strong></p>  <p style="margin-left:0cm;">（去除图片中的雨滴）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Qian_Attentive_Generative_Adversarial_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>3. </strong><strong>Deep Photo Enhancer: Unpaired Learning for Image Enhancement from Photographs with GANs</strong></p>  <p style="margin-left:0cm;">（用于照片增强）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Chen_Deep_Photo_Enhancer_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>4. 
</strong><strong>SeGAN: Segmenting and Generating the Invisible</strong></p>  <p style="margin-left:0cm;">（去遮挡）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Ehsani_SeGAN_Segmenting_and_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>5.Stacked Conditional Generative Adversarial Networks for Jointly Learning Shadow Detection and Shadow Removal</strong></p>  <p style="margin-left:0cm;">（去阴影）</p>  <p style="margin-left:0cm;"><span style="color:#0563c1;"><u><a href="http://openaccess.thecvf.com/content_cvpr_2018/papers/Wang_Stacked_Conditional_Generative_CVPR_2018_paper.pdf" rel="nofollow" target="_blank">http://openaccess.thecvf.com/content_cvpr_2018/papers/Wang_Stacked_Conditional_Generative_CVPR_2018_paper.pdf</a></u></span></p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>6.Image Blind Denoising With Generative Adversarial Network Based Noise Modeling</strong></p>  <p style="margin-left:0cm;">（去噪声）</p>  <p style="margin-left:0cm;"><span style="color:#0563c1;"><u><a href="http://openaccess.thecvf.com/content_cvpr_2018/papers/Chen_Image_Blind_Denoising_CVPR_2018_paper.pdf" rel="nofollow" target="_blank">http://openaccess.thecvf.com/content_cvpr_2018/papers/Chen_Image_Blind_Denoising_CVPR_2018_paper.pdf</a></u></span></p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>7. Single Image Dehazing via Conditional Generative Adversarial Network</strong></p>  <p style="margin-left:0cm;">（去雾）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Li_Single_Image_Dehazing_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <h3><a name="t3"></a><span style="color:#FF0000;">图片生成</span></h3>  <p style="margin-left:0cm;"><strong>1. 
ST-GAN: Spatial Transformer Generative Adversarial Networks for Image Compositing</strong></p>  <p style="margin-left:0cm;">（空间转换生成图片）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Lin_ST-GAN_Spatial_Transformer_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;">2. <strong>SketchyGAN: Towards Diverse and Realistic Sketch to Image Synthesis</strong></p>  <p style="margin-left:0cm;">（由边框生成图片）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Chen_SketchyGAN_Towards_Diverse_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>3. </strong><strong>TextureGAN: Controlling Deep Image Synthesis with Texture Patches</strong></p>  <p style="margin-left:0cm;">（由纹路生成图片）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Xian_TextureGAN_Controlling_Deep_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>4. Eye In-Painting with Exemplar Generative Adversarial Networks</strong></p>  <p style="margin-left:0cm;">（给人物画眼睛）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Dolhansky_Eye_In-Painting_With_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>5.Photographic Text-to-Image Synthesis with a Hierarchically-nested Adversarial Network</strong></p>  <p style="margin-left:0cm;">（文本生成图片）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Zhang_Photographic_Text-to-Image_Synthesis_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>6. 
Logo Synthesis and Manipulation with Clustered Generative Adversarial Networks</strong></p>  <p style="margin-left:0cm;">（生成logo）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Sage_Logo_Synthesis_and_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>7. </strong><strong>Cross-View Image Synthesis Using Conditional GANs</strong></p>  <p style="margin-left:0cm;">（街区俯视图和直视转换）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Regmi_Cross-View_Image_Synthesis_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>8. </strong><strong>AttnGAN: Fine-Grained Text to Image Generation with Attentional Generative Adversarial Networks</strong></p>  <p style="margin-left:0cm;">（文本生成图片）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Xu_AttnGAN_Fine-Grained_Text_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>9. High-Resolution Image Synthesis and Semantic Manipulation with Conditional GANs</strong></p>  <p style="margin-left:0cm;">（图像高分辨率）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Wang_High-Resolution_Image_Synthesis_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <h3><a name="t4"></a><span style="color:#FF0000;">人脸相关</span></h3>  <p style="margin-left:0cm;"><strong>1. </strong><strong>Finding Tiny Faces in the Wild with Generative Adversarial Network</strong></p>  <p style="margin-left:0cm;">（对低分辨率的人脸检测）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Bai_Finding_Tiny_Faces_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>2. 
</strong><strong>Learning Face Age Progression: A Pyramid Architecture of GANs</strong></p>  <p style="margin-left:0cm;">（预测年龄）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Yang_Learning_Face_Age_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>3. </strong><strong>Super-FAN: Integrated facial landmark localization and super-resolution of real-world low resolution faces in arbitrary poses with GANs</strong></p>  <p style="margin-left:0cm;">（对低分辨率人脸超分辨率）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Bulat_Super-FAN_Integrated_Facial_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;"><strong>4. <a href="http://openaccess.thecvf.com/content_cvpr_2018/html/Bao_Towards_Open-Set_Identity_CVPR_2018_paper.html" rel="nofollow" target="_blank"><span style="color:#000000;">Towards Open-Set Identity Preserving Face Synthesis</span></a></strong></p>  <p style="margin-left:0cm;">（人脸合成）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Bao_Towards_Open-Set_Identity_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>5. 
Weakly Supervised Facial Action Unit Recognition through Adversarial Training</strong></p>  <p style="margin-left:0cm;">（人脸表情识别）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Peng_Weakly_Supervised_Facial_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>6.FaceID-GAN: Learning a Symmetry Three-Player GAN for Identity-Preserving Face Synthesis</strong></p>  <p style="margin-left:0cm;">（生成多角度人脸）</p>  <p style="margin-left:0cm;"><span style="color:#0563c1;"><u><a href="http://openaccess.thecvf.com/content_cvpr_2018/papers/Shen_FaceID-GAN_Learning_a_CVPR_2018_paper.pdf" rel="nofollow" target="_blank">http://openaccess.thecvf.com/content_cvpr_2018/papers/Shen_FaceID-GAN_Learning_a_CVPR_2018_paper.pdf</a></u></span></p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>7. </strong><strong>UV-GAN: Adversarial Facial UV Map Completion for Pose-invariant Face Recognition</strong></p>  <p style="margin-left:0cm;">（人脸生成）</p>  <p style="margin-left:0cm;"><span style="color:#0563c1;"><u><a href="http://openaccess.thecvf.com/content_cvpr_2018/papers/Deng_UV-GAN_Adversarial_Facial_CVPR_2018_paper.pdf" rel="nofollow" target="_blank">http://openaccess.thecvf.com/content_cvpr_2018/papers/Deng_UV-GAN_Adversarial_Facial_CVPR_2018_paper.pdf</a></u></span></p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>8.Face Aging with Identity-Preserved Conditional Generative Adversarial Networks</strong></p>  <p style="margin-left:0cm;">（人脸老化）</p>  <p style="margin-left:0cm;"><span style="color:#0563c1;"><u><a href="http://openaccess.thecvf.com/content_cvpr_2018/papers/Wang_Face_Aging_With_CVPR_2018_paper.pdf" rel="nofollow" target="_blank">http://openaccess.thecvf.com/content_cvpr_2018/papers/Wang_Face_Aging_With_CVPR_2018_paper.pdf</a></u></span></p>  <p style="margin-left:0cm;">&nbsp;</p>  <h3><a name="t5"></a><span 
style="color:#FF0000;">人体相关</span></h3>  <p style="margin-left:0cm;"><strong>1. </strong><strong>Deformable GANs for Pose-based Human Image Generation</strong></p>  <p style="margin-left:0cm;">（人物姿态迁移）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Siarohin_Deformable_GANs_for_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>2. </strong><strong>Social GAN: Socially Acceptable Trajectories with Generative Adversarial Networks</strong></p>  <p style="margin-left:0cm;">(用GAN生成人行为轨迹追踪)</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Gupta_Social_GAN_Socially_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>3. </strong><strong>GANerated Hands for Real-Time 3D Hand Tracking from Monocular RGB</strong></p>  <p style="margin-left:0cm;">（用GAN生成的手势图片做手势追踪的数据集）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Mueller_GANerated_Hands_for_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>4. Multistage Adversarial Losses for Pose-Based Human Image Synthesis</strong></p>  <p style="margin-left:0cm;">（人体姿态合成）</p>  <p style="margin-left:0cm;"><span style="color:#0563c1;"><u><a href="http://openaccess.thecvf.com/content_cvpr_2018/papers/Si_Multistage_Adversarial_Losses_CVPR_2018_paper.pdf" rel="nofollow" target="_blank">http://openaccess.thecvf.com/content_cvpr_2018/papers/Si_Multistage_Adversarial_Losses_CVPR_2018_paper.pdf</a></u></span></p>  <p style="margin-left:0cm;"><strong>5. 
Disentangled Person Image Generation</strong></p>  <p style="margin-left:0cm;">（人体合成）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Ma_Disentangled_Person_Image_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <h3><a name="t6"></a><span style="color:#FF0000;">domain adaptation</span></h3>  <p>（这个没来得及找了，可能转行咯~ 唉）</p>  <p style="margin-left:0cm;"><strong>1. Generate to Adapt: Aligning Domains Using Generative Adversarial Networks</strong></p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>2. Re-Weighted Adversarial Adaptation Network for Unsupervised Domain Adaptation</strong></p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>3. Adversarial Feature Augmentation for Unsupervised Domain Adaptation</strong></p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>4. Domain Generalization With Adversarial Feature Learning</strong></p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>5. Image to Image Translation for Domain Adaptation</strong></p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>6. Duplex Generative Adversarial Network for Unsupervised Domain Adaptation</strong></p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>7. Conditional Generative Adversarial Network for Structured Domain Adaptation</strong></p>  <p style="margin-left:0cm;">&nbsp;</p>  <h3><a name="t7"></a><span style="color:#FF0000;">目标跟踪检测</span></h3>  <p style="margin-left:0cm;"><strong>1.Generative Adversarial Learning Towards Fast Weakly Supervised Detection</strong></p>  <p style="margin-left:0cm;">（弱监督检测）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Shen_Generative_Adversarial_Learning_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>2. 
SINT++: Robust Visual Tracking via Adversarial Positive Instance Generation</strong></p>  <p style="margin-left:0cm;">（对抗学习生成轨迹样本）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Wang_SINT_Robust_Visual_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>3. VITAL: VIsual Tracking via Adversarial Learning</strong></p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Song_VITAL_VIsual_Tracking_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <h3><a name="t8"></a><span style="color:#FF0000;">GAN</span><span style="color:#FF0000;">模型优化</span></h3>  <p style="margin-left:0cm;"><strong>1. </strong><strong>SGAN: An Alternative Training of Generative Adversarial Network</strong></p>  <p style="margin-left:0cm;">（替代训练GAN）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Chavdarova_SGAN_An_Alternative_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>2. </strong><strong>GAGAN: Geometry-Aware Generative Adversarial Networks</strong></p>  <p style="margin-left:0cm;">（一种关注几何外形的GAN）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Kossaifi_GAGAN_Geometry-Aware_Generative_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>3.Global versus Localized Generative Adversarial Nets</strong></p>  <p style="margin-left:0cm;">(局部优化GAN)</p>  <p style="margin-left:0cm;"><span style="color:#0563c1;"><u><a href="http://openaccess.thecvf.com/content_cvpr_2018/papers/Qi_Global_Versus_Localized_CVPR_2018_paper.pdf" rel="nofollow" target="_blank">http://openaccess.thecvf.com/content_cvpr_2018/papers/Qi_Global_Versus_Localized_CVPR_2018_paper.pdf</a></u></span></p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>4. 
Generative Adversarial Image Synthesis with Decision Tree Latent Controller</strong></p>  <p style="margin-left:0cm;">（决策树）</p>  <p style="margin-left:0cm;"><span style="color:#0563c1;"><u><a href="http://openaccess.thecvf.com/content_cvpr_2018/papers/Kaneko_Generative_Adversarial_Image_CVPR_2018_paper.pdf" rel="nofollow" target="_blank">http://openaccess.thecvf.com/content_cvpr_2018/papers/Kaneko_Generative_Adversarial_Image_CVPR_2018_paper.pdf</a></u></span></p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>5. Unsupervised Deep Generative Adversarial Hashing Network</strong></p>  <p style="margin-left:0cm;">（哈希GAN）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Dizaji_Unsupervised_Deep_Generative_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>6. Multi-Agent Diverse Generative Adversarial Networks</strong></p>  <p style="margin-left:0cm;">（多个生成器GAN）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Ghosh_Multi-Agent_Diverse_Generative_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>7. </strong><strong>Duplex Generative Adversarial Network for Unsupervised Domain Adaptation</strong></p>  <p style="margin-left:0cm;">（双鉴别器GAN）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Hu_Duplex_Generative_Adversarial_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <h3><a name="t9"></a><span style="color:#FF0000;">图像分割</span></h3>  <p style="margin-left:0cm;"><strong>1. 
</strong><strong>Translating and Segmenting Multimodal Medical Volumes With Cycle- and Shape-Consistency Generative Adversarial Network</strong></p>  <p style="margin-left:0cm;">（图像分割）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Zhang_Translating_and_Segmenting_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <h3><a name="t10"></a><span style="color:#FF0000;">行人重识别</span></h3>  <p style="margin-left:0cm;"><strong>1. </strong><strong>Person Transfer GAN to Bridge Domain Gap for Person Re-Identification</strong></p>  <p style="margin-left:0cm;">（用GAN生成的人体检测的图片）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Wei_Person_Transfer_GAN_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <p style="margin-left:0cm;"><strong>2. Image-Image Domain Adaptation with Preserved Self-Similarity and Domain-Dissimilarity for Person Re-identification</strong></p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Deng_Image-Image_Domain_Adaptation_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <h3><a name="t11"></a><span style="color:#FF0000;">视觉特征提取</span></h3>  <p style="margin-left:0cm;"><strong>1. Visual Feature Attribution using Wasserstein GANs</strong></p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Baumgartner_Visual_Feature_Attribution_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <h3><a name="t12"></a><span style="color:#FF0000;">域自适应学习</span></h3>  <p style="margin-left:0cm;"><strong>1. 
</strong><strong>Generate To Adapt: Aligning Domains using Generative Adversarial Networks</strong></p>  <p style="margin-left:0cm;">（视觉域自适应）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Sankaranarayanan_Generate_to_Adapt_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <h3><a name="t13"></a><span style="color:#FF0000;">图像检索</span></h3>  <p style="margin-left:0cm;"><strong>1. HashGAN: Deep Learning to Hash with Pair Conditional Wasserstein GAN</strong></p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Cao_HashGAN_Deep_Learning_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <h3><a name="t14"></a><span style="color:#FF0000;">迁移学习</span></h3>  <p style="margin-left:0cm;"><strong>1.Partial Transfer Learning With Selective Adversarial Networks</strong></p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Cao_Partial_Transfer_Learning_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <h3><a name="t15"></a><span style="color:#FF0000;">视频生成</span></h3>  <p style="margin-left:0cm;"><strong>1. </strong><strong>MoCoGAN: Decomposing Motion and Content for Video Generation</strong></p>  <p style="margin-left:0cm;">（用GAN生成视频）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Tulyakov_MoCoGAN_Decomposing_Motion_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;"><strong>2. 
Learning to Generate Time-Lapse Videos Using Multi-Stage Dynamic Generative Adversarial Networks</strong></p>  <p style="margin-left:0cm;">（生成延时视频）</p>  <p style="margin-left:0cm;">http://openaccess.thecvf.com/content_cvpr_2018/papers/Xiong_Learning_to_Generate_CVPR_2018_paper.pdf</p>  <p style="margin-left:0cm;">&nbsp;</p>  <h3><a name="t16"></a><strong>小结：</strong></h3>  <p style="margin-left:0cm;">&nbsp; 可以看出GAN相关的论文还不少呀，各个方面的都有，可是我个人觉得，可能没有那种特别厉害的吧~hh</p>--------------------- 本文来自 眉间细雪 的CSDN 博客 ，全文地址请点击：https://blog.csdn.net/weixin_42445501/article/details/82792311?utm_source=copy&nbsp;<img src ="http://www.cppblog.com/guijie/aggbug/215981.html" width = "1" height = "1" /><br><br><div align=right><a style="text-decoration:none;" href="http://www.cppblog.com/guijie/" target="_blank">杰哥</a> 2018-09-28 22:00 <a href="http://www.cppblog.com/guijie/archive/2018/09/28/215981.html#Feedback" target="_blank" style="text-decoration:none;">发表评论</a></div>]]></description></item><item><title>[zz] Caffe2和Caffe有何不同？</title><link>http://www.cppblog.com/guijie/archive/2018/06/25/215741.html</link><dc:creator>杰哥</dc:creator><author>杰哥</author><pubDate>Sun, 24 Jun 2018 21:13:00 GMT</pubDate><guid>http://www.cppblog.com/guijie/archive/2018/06/25/215741.html</guid><wfw:comment>http://www.cppblog.com/guijie/comments/215741.html</wfw:comment><comments>http://www.cppblog.com/guijie/archive/2018/06/25/215741.html#Feedback</comments><slash:comments>0</slash:comments><wfw:commentRss>http://www.cppblog.com/guijie/comments/commentRss/215741.html</wfw:commentRss><trackback:ping>http://www.cppblog.com/guijie/services/trackbacks/215741.html</trackback:ping><description><![CDATA[<p style="box-sizing: border-box; outline: 0px; padding: 10px 0px 20px; margin: 0px; font-size: 16px; color: #191919; line-height: 26px; text-align: justify; word-break: break-all; background-color: #ffffff; border: 0px; font-family: &quot;PingFang SC&quot;, Arial, 微软雅黑, 宋体, simsun, 
sans-serif;">Caffe2发布后，外界最多的讨论之一，就是发出上述疑问。去年12月，贾扬清曾经解释过一次：&#8220;目前Caffe2还不能完全替代Caffe，还缺不少东西，例如CuDNN。与Caffe2相比，Caffe仍然是主要的稳定版本，在生产环境中使用仍然推荐Caffe&#8221;。</p><p style="box-sizing: border-box; outline: 0px; padding: 10px 0px 20px; margin: 0px; font-size: 16px; color: #191919; line-height: 26px; text-align: justify; word-break: break-all; background-color: #ffffff; border: 0px; font-family: &quot;PingFang SC&quot;, Arial, 微软雅黑, 宋体, simsun, sans-serif;">现在Caffe2正式发布，这个推荐肯定要改成新版本了。</p><p style="box-sizing: border-box; outline: 0px; padding: 10px 0px 20px; margin: 0px; font-size: 16px; color: #191919; line-height: 26px; text-align: justify; word-break: break-all; background-color: #ffffff; border: 0px; font-family: &quot;PingFang SC&quot;, Arial, 微软雅黑, 宋体, simsun, sans-serif;">Caffe2的基本计算单位是Operator。对于适当数量和类型的输入参数，每个Operator都包括所需的计算逻辑。Caffe和Caffe2的总体差异如下图所示：</p><p style="box-sizing: border-box; outline: 0px; padding: 10px 0px 20px; margin: 0px; font-size: 16px; color: #191919; line-height: 26px; text-align: center; word-break: break-all; background-color: #ffffff; border: 0px; font-family: &quot;PingFang SC&quot;, Arial, 微软雅黑, 宋体, simsun, sans-serif;"><img src="http://img.mp.itc.cn/upload/20170419/298fe20c58844c8eb152e458fcf98f1c.png" alt="" style="box-sizing: border-box; outline: 0px; margin: 10px auto 0px; max-width: 100%; word-break: break-all; cursor: zoom-in; border: 0px; padding: 0px; display: block;" /></p><p style="box-sizing: border-box; outline: 0px; padding: 10px 0px 20px; margin: 0px; font-size: 16px; color: #191919; line-height: 26px; text-align: justify; word-break: break-all; background-color: #ffffff; border: 0px; font-family: &quot;PingFang SC&quot;, Arial, 微软雅黑, 宋体, simsun, sans-serif;">官方提供了从Caffe迁移到Caffe2的教程，据说这个迁移非常简单。</p><p style="box-sizing: border-box; outline: 0px; padding: 10px 0px 20px; margin: 0px; font-size: 16px; color: #191919; line-height: 26px; text-align: justify; word-break: break-all; background-color: #ffffff; border: 0px; 
font-family: &quot;PingFang SC&quot;, Arial, 微软雅黑, 宋体, simsun, sans-serif;">How is Caffe2 different from PyTorch?</p><p style="box-sizing: border-box; outline: 0px; padding: 10px 0px 20px; margin: 0px; font-size: 16px; color: #191919; line-height: 26px; text-align: justify; word-break: break-all; background-color: #ffffff; border: 0px; font-family: &quot;PingFang SC&quot;, Arial, 微软雅黑, 宋体, simsun, sans-serif;">That is another common question.</p><p style="box-sizing: border-box; outline: 0px; padding: 10px 0px 20px; margin: 0px; font-size: 16px; color: #191919; line-height: 26px; text-align: justify; word-break: break-all; background-color: #ffffff; border: 0px; font-family: &quot;PingFang SC&quot;, Arial, 微软雅黑, 宋体, simsun, sans-serif;">Caffe2 excels at mobile and large-scale deployment. The newly added multi-GPU support gives the new framework the same GPU capability as Torch, and as noted above, Caffe2 supports distributed training across multiple GPUs on a single machine, or across multiple machines with one or more GPUs each.</p><p style="box-sizing: border-box; outline: 0px; padding: 10px 0px 20px; margin: 0px; font-size: 16px; color: #191919; line-height: 26px; text-align: justify; word-break: break-all; background-color: #ffffff; border: 0px; font-family: &quot;PingFang SC&quot;, Arial, 微软雅黑, 宋体, simsun, sans-serif;"><span style="color: red;">PyTorch is well suited to research, experimentation, and trying out different neural networks</span>; Caffe2, by contrast, leans toward industrial applications, with a particular focus on performance on mobile devices.</p><p style="box-sizing: border-box; outline: 0px; padding: 10px 0px 20px; margin: 0px; font-size: 16px; color: #191919; line-height: 26px; text-align: justify; word-break: break-all; background-color: #ffffff; border: 0px; font-family: &quot;PingFang SC&quot;, Arial, 微软雅黑, 宋体, simsun, sans-serif;">Yangqing Jia speaks for himself</p><p style="box-sizing: border-box; outline: 0px; padding: 10px 0px 20px; margin: 0px; font-size: 16px; color: #191919; line-height: 26px; text-align: justify; word-break: break-all; background-color: #ffffff; border: 0px; font-family: &quot;PingFang SC&quot;, Arial, 微软雅黑, 宋体, simsun, sans-serif;">After the Caffe2 release, its author Yangqing Jia posted four answers in a row on reddit. &#8220;Yangqing here&#8221;, he began, making his identity clear up front.</p><p style="box-sizing: border-box; outline: 0px; padding: 10px 0px 20px; margin: 0px; font-size: 16px; 
color: #191919; line-height: 26px; text-align: justify; word-break: break-all; background-color: #ffffff; border: 0px; font-family: &quot;PingFang SC&quot;, Arial, 微软雅黑, 宋体, simsun, sans-serif;">　　<img src="http://img.mp.itc.cn/upload/20170419/11efa7fa9ea2438e8a7e338bafc121e2_th.jpeg" alt="" style="box-sizing: border-box; outline: 0px; margin: 10px auto 0px; max-width: 100%; word-break: break-all; cursor: zoom-in; border: 0px; padding: 0px; display: block;" /></p><p style="box-sizing: border-box; outline: 0px; padding: 10px 0px 20px; margin: 0px; font-size: 16px; color: #191919; line-height: 26px; text-align: justify; word-break: break-all; background-color: #ffffff; border: 0px; font-family: &quot;PingFang SC&quot;, Arial, 微软雅黑, 宋体, simsun, sans-serif;"><span style="color: red;">Someone asked what the point of building Caffe2 is, when PyTorch, TensorFlow, MXNet, and many other frameworks already exist.</span></p><p style="box-sizing: border-box; outline: 0px; padding: 10px 0px 20px; margin: 0px; font-size: 16px; color: #191919; line-height: 26px; text-align: justify; word-break: break-all; background-color: #ffffff; border: 0px; font-family: &quot;PingFang SC&quot;, Arial, 微软雅黑, 宋体, simsun, sans-serif;">Jia said the Caffe2 and PyTorch teams work closely together. They see Caffe2 as the production choice, <span style="color: red;">and Torch as the research choice.</span> When building AI components they also follow a &#8220;non-framework&#8221; philosophy: libraries such as Gloo, NNPACK, and FAISS can be used with any deep learning framework.</p><p style="box-sizing: border-box; outline: 0px; padding: 10px 0px 20px; margin: 0px; font-size: 16px; color: #191919; line-height: 26px; text-align: justify; word-break: break-all; background-color: #ffffff; border: 0px; font-family: &quot;PingFang SC&quot;, Arial, 微软雅黑, 宋体, simsun, sans-serif;">Someone asked whether Caffe2 accepts external contributions.</p><p style="box-sizing: border-box; outline: 0px; padding: 10px 0px 20px; margin: 0px; font-size: 16px; color: #191919; line-height: 26px; text-align: justify; word-break: break-all; background-color: #ffffff; border: 0px; font-family: &quot;PingFang SC&quot;, Arial, 微软雅黑, 宋体, simsun, sans-serif;">Jia said they love external contributions and will keep working hard on open source.</p><p style="box-sizing: border-box; 
outline: 0px; padding: 10px 0px 20px; margin: 0px; font-size: 16px; color: #191919; line-height: 26px; text-align: justify; word-break: break-all; background-color: #ffffff; border: 0px; font-family: &quot;PingFang SC&quot;, Arial, 微软雅黑, 宋体, simsun, sans-serif;">Someone asked whether Caffe2 uses Torch's codebase, and about CUDA and related support.</p><p style="box-sizing: border-box; outline: 0px; padding: 10px 0px 20px; margin: 0px; font-size: 16px; color: #191919; line-height: 26px; text-align: justify; word-break: break-all; background-color: #ffffff; border: 0px; font-family: &quot;PingFang SC&quot;, Arial, 微软雅黑, 宋体, simsun, sans-serif;">Jia said they are planning for Caffe2, Torch, and PyTorch to share back ends; these frameworks already share Gloo for distributed training, and THCTensor, THNN, and other C/C++ libraries will be shared as well.</p><p style="box-sizing: border-box; outline: 0px; padding: 10px 0px 20px; margin: 0px; font-size: 16px; color: #191919; line-height: 26px; text-align: justify; word-break: break-all; background-color: #ffffff; border: 0px; font-family: &quot;PingFang SC&quot;, Arial, 微软雅黑, 宋体, simsun, sans-serif;">At the GPU level, Caffe2 uses CUDA and CUDNN. Jia and his team also experimented with OpenCL, but felt that CUDA on NVIDIA GPUs works better.</p><p style="box-sizing: border-box; outline: 0px; padding: 10px 0px 20px; margin: 0px; font-size: 16px; color: #191919; line-height: 26px; text-align: justify; word-break: break-all; background-color: #ffffff; border: 0px; font-family: &quot;PingFang SC&quot;, Arial, 微软雅黑, 宋体, simsun, sans-serif;">On other platforms (for example iOS), Caffe2 uses platform-specific tooling such as Metal. The official Metal implementation would be released within a day or two.</p><p style="box-sizing: border-box; outline: 0px; padding: 10px 0px 20px; margin: 0px; font-size: 16px; color: #191919; line-height: 26px; text-align: justify; word-break: break-all; background-color: #ffffff; border: 0px; font-family: &quot;PingFang SC&quot;, Arial, 微软雅黑, 宋体, simsun, sans-serif;">Someone asked whether Caffe2 supports dynamic graphs.</p><p style="box-sizing: border-box; outline: 0px; padding: 10px 0px 20px; margin: 0px; font-size: 16px; color: #191919; line-height: 26px; text-align: justify; word-break: break-all; background-color: #ffffff; border: 0px; font-family: &quot;PingFang SC&quot;, 
Arial, 微软雅黑, 宋体, simsun, sans-serif;">Jia's answer was no, and he said this was a deliberate choice by the Caffe2 and PyTorch teams. Caffe2's job is to deliver the best possible performance; if you want extremely flexible computation, choose PyTorch. Jia considers this the better approach, because a &#8220;one framework for everything&#8221; design can hurt performance.</p><p style="box-sizing: border-box; outline: 0px; padding: 10px 0px 20px; margin: 0px; font-size: 16px; color: #191919; line-height: 26px; text-align: justify; word-break: break-all; background-color: #ffffff; border: 0px; font-family: &quot;PingFang SC&quot;, Arial, 微软雅黑, 宋体, simsun, sans-serif;">So for now, Caffe2 supports only very limited dynamic control, such as dynamic RNNs.</p><p style="box-sizing: border-box; outline: 0px; padding: 10px 0px 20px; margin: 0px; font-size: 16px; color: #191919; line-height: 26px; text-align: justify; word-break: break-all; background-color: #ffffff; border: 0px; font-family: &quot;PingFang SC&quot;, Arial, 微软雅黑, 宋体, simsun, sans-serif;">Finally, QbitAI's links:</p><p style="box-sizing: border-box; outline: 0px; padding: 10px 0px 20px; margin: 0px; font-size: 16px; color: #191919; line-height: 26px; text-align: justify; word-break: break-all; background-color: #ffffff; border: 0px; font-family: &quot;PingFang SC&quot;, Arial, 微软雅黑, 宋体, simsun, sans-serif;">Caffe2 homepage: http://caffe2.ai/</p><p style="box-sizing: border-box; outline: 0px; padding: 10px 0px 20px; margin: 0px; font-size: 16px; color: #191919; line-height: 26px; text-align: justify; word-break: break-all; background-color: #ffffff; border: 0px; font-family: &quot;PingFang SC&quot;, Arial, 微软雅黑, 宋体, simsun, sans-serif;">GitHub repository: https://github.com/caffe2/caffe2</p><br />Reference:<br /><br /><div>https://blog.csdn.net/zchang81/article/details/70316864?utm_source=itdadao&amp;utm_medium=referral<br />Reading record: read twice<br /></div><img src ="http://www.cppblog.com/guijie/aggbug/215741.html" width = "1" height = "1" /><br><br><div align=right><a style="text-decoration:none;" href="http://www.cppblog.com/guijie/" target="_blank">杰哥</a> 2018-06-25 05:13 <a href="http://www.cppblog.com/guijie/archive/2018/06/25/215741.html#Feedback" target="_blank" 
style="text-decoration:none;">发表评论</a></div>]]></description></item><item><title>Reviewer</title><link>http://www.cppblog.com/guijie/archive/2018/03/07/215549.html</link><dc:creator>杰哥</dc:creator><author>杰哥</author><pubDate>Wed, 07 Mar 2018 14:56:00 GMT</pubDate><guid>http://www.cppblog.com/guijie/archive/2018/03/07/215549.html</guid><wfw:comment>http://www.cppblog.com/guijie/comments/215549.html</wfw:comment><comments>http://www.cppblog.com/guijie/archive/2018/03/07/215549.html#Feedback</comments><slash:comments>0</slash:comments><wfw:commentRss>http://www.cppblog.com/guijie/comments/commentRss/215549.html</wfw:commentRss><trackback:ping>http://www.cppblog.com/guijie/services/trackbacks/215549.html</trackback:ping><description><![CDATA[<pre style="white-space: pre-wrap; word-wrap: break-word; width: 946.04px;">As a reminder, please note that some word-processing systems (e.g. Word) will record the document author's name in an author field of the file. This matters if you are going to upload an attachment produced by a word processor with your review: a Word file converted to PDF will carry this author information. Please make sure you do not include this information. You can check an uploaded PDF by clicking File, Document Properties, Summary. You can also click the small right-pointing triangle above the page slider on the right of the document to reach the document summary. 
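The metadata check above can also be scripted. A minimal sketch in plain Python (an illustrative heuristic that scans the raw PDF bytes for an uncompressed /Author entry; it is not a full PDF parser, and metadata stored in compressed object streams or XMP will not be found this way):

```python
import re

def pdf_author_entries(pdf_bytes: bytes):
    """Return the raw /Author values found in a PDF's info dictionaries.

    Heuristic scan of the uncompressed bytes only; a dedicated PDF
    library is more reliable for real files.
    """
    # Matches the literal-string form, e.g. "/Author (Jane Reviewer)"
    return [m.group(1) for m in re.finditer(rb"/Author\s*\(([^)]*)\)", pdf_bytes)]

# A tiny stand-in for a PDF info dictionary (not a complete PDF file)
sample = b"1 0 obj << /Title (review.pdf) /Author (Jane Reviewer) >> endobj"
print(pdf_author_entries(sample))  # -> [b'Jane Reviewer']: scrub before uploading
```

If the returned list is non-empty, clear the author field in the source document and re-export before uploading the review.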
The author field should be blank, or should at least clearly contain neither the reviewer's name nor any other information that could identify the reviewer.</pre><img src ="http://www.cppblog.com/guijie/aggbug/215549.html" width = "1" height = "1" /><br><br><div align=right><a style="text-decoration:none;" href="http://www.cppblog.com/guijie/" target="_blank">杰哥</a> 2018-03-07 22:56 <a href="http://www.cppblog.com/guijie/archive/2018/03/07/215549.html#Feedback" target="_blank" style="text-decoration:none;">发表评论</a></div>]]></description></item><item><title>Topics of KDD 2018 </title><link>http://www.cppblog.com/guijie/archive/2018/02/04/215502.html</link><dc:creator>杰哥</dc:creator><author>杰哥</author><pubDate>Sat, 03 Feb 2018 16:32:00 GMT</pubDate><guid>http://www.cppblog.com/guijie/archive/2018/02/04/215502.html</guid><wfw:comment>http://www.cppblog.com/guijie/comments/215502.html</wfw:comment><comments>http://www.cppblog.com/guijie/archive/2018/02/04/215502.html#Feedback</comments><slash:comments>0</slash:comments><wfw:commentRss>http://www.cppblog.com/guijie/comments/commentRss/215502.html</wfw:commentRss><trackback:ping>http://www.cppblog.com/guijie/services/trackbacks/215502.html</trackback:ping><description><![CDATA[<table style="background-color: #e4e1d3; padding: 6pt 12pt 2pt; border-radius: 8pt; border-style: solid; border-color: #aaaaaa; margin-top: 10pt; margin-bottom: 10pt; color: #000000; font-family: Verdana, Arial, Helvetica, sans-serif; font-size: 13px; empty-cells: show;"><tbody><tr><td colspan="2" style="padding: 2pt 3pt 1pt;"><strong>Applications</strong></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input name="topic" value="228873" type="checkbox" />Advertising and E-commerce</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" value="228874" name="topic" />Bioinformatics</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" name="topic" value="228875" />Education</label></td><td style="padding: 1pt 10pt 1pt 
3pt;"><label><input type="checkbox" value="228876" name="topic" />Finance</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" value="228877" name="topic" />Healthcare and medicine</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input name="topic" value="228878" type="checkbox" />Information retrieval and Language</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input name="topic" value="228879" type="checkbox" />Marketing</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" value="228880" name="topic" />Markets and Crowds</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" name="topic" value="228881" />Mobile and Sensor devices</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" value="228882" name="topic" />Network sciences</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input value="228883" name="topic" type="checkbox" />Social good</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input value="228884" name="topic" type="checkbox" />Social media and publishing</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" name="topic" value="228885" />Social sciences</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" value="228886" name="topic" />User modeling</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" name="topic" value="228887" />Web mining</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" value="228888" name="topic" />Applications--Others</label></td></tr><tr><td colspan="2" style="padding: 2pt 3pt 1pt;"><strong>Big data</strong></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" value="228890" name="topic" />Cloud, Map-Reduce, and MPI</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input 
name="topic" value="228891" type="checkbox" />Data streams</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input name="topic" value="228892" type="checkbox" />Hashing</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input name="topic" value="228893" type="checkbox" />Infrastructure</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input name="topic" value="228894" type="checkbox" />Large-scale optimization</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" value="228895" name="topic" />Sampling</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input name="topic" value="228896" type="checkbox" />Scalable algorithms</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input value="228897" name="topic" type="checkbox" />Big data--Others</label></td></tr><tr><td colspan="2" style="padding: 2pt 3pt 1pt;"><strong>Data mining foundations</strong></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" value="228899" name="topic" />Data ethics</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input name="topic" value="228900" type="checkbox" />Data mining methodology</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input value="228901" name="topic" type="checkbox" />Design of experiments</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input name="topic" value="228902" type="checkbox" />Data mining foundations--Others</label></td></tr><tr><td colspan="2" style="padding: 2pt 3pt 1pt;"><strong>Graphs and social networks</strong></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" value="228904" name="topic" />Community detection</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" name="topic" value="228905" />Graph algorithms</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" value="228906" name="topic" />Influence 
and diffusion</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" name="topic" value="228907" />Link prediction</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input value="228908" name="topic" type="checkbox" />Network analysis</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" name="topic" value="228909" />Social phenomena</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" value="228910" name="topic" />Graphs and social networks--Others</label></td></tr><tr><td colspan="2" style="padding: 2pt 3pt 1pt;"><strong>Knowledge discovery</strong></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" value="228912" name="topic" />Causal inference</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input name="topic" value="228913" type="checkbox" />Exploratory analysis</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input name="topic" value="228914" type="checkbox" />Interpretable models</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" name="topic" value="228915" />Sparse models</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" name="topic" value="228916" />Visualization</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" name="topic" value="228917" />Knowledge discovery--Others</label></td></tr><tr><td colspan="2" style="padding: 2pt 3pt 1pt;"><strong>Methods</strong></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input value="228919" name="topic" type="checkbox" />Active learning</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" name="topic" value="228920" />Bayesian inference</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" value="228921" name="topic" />Decision trees</label></td><td style="padding: 1pt 10pt 1pt 
3pt;"><label><input name="topic" value="228922" type="checkbox" />Ensemble methods</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input name="topic" value="228923" type="checkbox" />Kernel methods</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" value="228924" name="topic" />Large-margin methods</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" value="228925" name="topic" />Matrix and tensor methods</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input value="228926" name="topic" type="checkbox" />Model selection</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" value="228927" name="topic" />Neural networks and deep learning</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" value="228928" name="topic" />Online learning and bandits</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input name="topic" value="228929" type="checkbox" />Probabilistic methods and graphical models</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" value="228930" name="topic" />Relational learning</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" name="topic" value="228931" />Similarity-based methods</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" name="topic" value="228932" />Methods--Others</label></td></tr><tr><td colspan="2" style="padding: 2pt 3pt 1pt;"><strong>Rich data types</strong></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" value="228934" name="topic" />Relations and structure</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" value="228935" name="topic" />Sequences</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" value="228936" name="topic" />Spatial data</label></td><td 
style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" value="228937" name="topic" />Temporal and time-series data</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input name="topic" value="228938" type="checkbox" />Text</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" name="topic" value="228939" />Unstructured data</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input value="228940" name="topic" type="checkbox" />Mining rich data types--Others</label></td></tr><tr><td colspan="2" style="padding: 2pt 3pt 1pt;"><strong>Recommender systems</strong></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" value="228942" name="topic" />Cold-start</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" name="topic" value="228943" />Collaborative filtering</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input name="topic" value="228944" type="checkbox" />Content-based methods</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" value="228945" name="topic" />Recommender systems--Others</label></td></tr><tr><td colspan="2" style="padding: 2pt 3pt 1pt;"><strong>Security</strong></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input name="topic" value="228947" type="checkbox" />Adversarial data mining</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" value="228948" name="topic" />Anomaly detection</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" value="228949" name="topic" />Anonymization</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" name="topic" value="228950" />Fraud detection</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input name="topic" value="228951" type="checkbox" />Intrusion detection</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input 
value="228952" name="topic" type="checkbox" />Privacy in data mining</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input name="topic" value="228953" type="checkbox" />Spam detection</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input value="228954" name="topic" type="checkbox" />Trust and truthfulness</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input value="228955" name="topic" type="checkbox" />Security--Others</label></td></tr><tr><td colspan="2" style="padding: 2pt 3pt 1pt;"><strong>Supervised learning</strong></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input value="228957" name="topic" type="checkbox" />Classification</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" name="topic" value="228958" />Learning to rank</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input value="228959" name="topic" type="checkbox" />Multi-label learning</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input name="topic" value="228960" type="checkbox" />Regression</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" value="228961" name="topic" />Semi-supervised learning</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input value="228962" name="topic" type="checkbox" />Structured output prediction</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" name="topic" value="228963" />Transfer learning</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" name="topic" value="228964" />Supervised learning--Others</label></td></tr><tr><td colspan="2" style="padding: 2pt 3pt 1pt;"><strong>Unsupervised learning</strong></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" name="topic" value="228966" />Clustering</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input name="topic" value="228967" type="checkbox" 
/>Dimension reduction</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" value="228968" name="topic" />Embeddings</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" name="topic" value="228969" />Feature selection</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" value="228970" name="topic" />Manifold learning</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input value="228971" name="topic" type="checkbox" />Matrix/tensor factorization</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" name="topic" value="228972" />Rule and pattern mining</label></td><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" value="228973" name="topic" />Topic and latent variable models</label></td></tr><tr><td style="padding: 1pt 10pt 1pt 3pt;"><label><input type="checkbox" name="topic" value="228974" />Unsupervised learning--Others</label></td></tr></tbody></table>Read twice<img src ="http://www.cppblog.com/guijie/aggbug/215502.html" width = "1" height = "1" /><br><br><div align=right><a style="text-decoration:none;" href="http://www.cppblog.com/guijie/" target="_blank">杰哥</a> 2018-02-04 00:32 <a href="http://www.cppblog.com/guijie/archive/2018/02/04/215502.html#Feedback" target="_blank" style="text-decoration:none;">发表评论</a></div>]]></description></item><item><title>2017 Most Cited Chinese Researchers list</title><link>http://www.cppblog.com/guijie/archive/2018/01/23/215482.html</link><dc:creator>杰哥</dc:creator><author>杰哥</author><pubDate>Mon, 22 Jan 2018 20:50:00 
GMT</pubDate><guid>http://www.cppblog.com/guijie/archive/2018/01/23/215482.html</guid><wfw:comment>http://www.cppblog.com/guijie/comments/215482.html</wfw:comment><comments>http://www.cppblog.com/guijie/archive/2018/01/23/215482.html#Feedback</comments><slash:comments>0</slash:comments><wfw:commentRss>http://www.cppblog.com/guijie/comments/commentRss/215482.html</wfw:commentRss><trackback:ping>http://www.cppblog.com/guijie/services/trackbacks/215482.html</trackback:ping><description><![CDATA[<div>http://china.elsevier.com/ElsevierDNN/Portals/7/mostcited/2017/computer-science.html<br /><br /><br /><div><div most-cited=""  pt-4"=""><h3>计算机科学</h3><table table-hover"=""><thead><tr><th>学者姓名</th><th>目前工作单位</th></tr></thead><tbody><tr><td>刘铁岩</td><td>微软亚洲研究院</td></tr><tr><td>王建勇</td><td>清华大学</td></tr><tr><td>Stojmenović, Ivan</td><td>清华大学</td></tr><tr><td>白翔</td><td>华中科技大学</td></tr><tr><td>蔡登</td><td>浙江大学</td></tr><tr><td>蔡开元</td><td>北京航空航天大学</td></tr><tr><td>曹珍富</td><td>上海交通大学</td></tr><tr><td>曾志刚</td><td>华中科技大学</td></tr><tr><td>常虹</td><td>中国科学院大学</td></tr><tr><td>陈兵</td><td>青岛大学</td></tr><tr><td>陈积明</td><td>浙江大学</td></tr><tr><td>陈敏</td><td>华中科技大学</td></tr><tr><td>陈清江</td><td>西安建筑科技大学</td></tr><tr><td>陈胜勇</td><td>浙江工业大学</td></tr><tr><td>陈松灿</td><td>南京航空航天大学</td></tr><tr><td>陈天平</td><td>复旦大学</td></tr><tr><td>陈月辉</td><td>济南大学</td></tr><tr><td>程明明</td><td>南开大学</td></tr><tr><td>仇计清</td><td>河北科技大学</td></tr><tr><td>邓勇</td><td>西南大学</td></tr><tr><td>丁永生</td><td>东华大学</td></tr><tr><td>樊建席</td><td>苏州大学</td></tr><tr><td>樊治平</td><td>东北大学</td></tr><tr><td>范平志</td><td>西南交通大学</td></tr><tr><td>冯国灿</td><td>中山大学</td></tr><tr><td>高曙明</td><td>浙江大学</td></tr><tr><td>高铁杠</td><td>南开大学</td></tr><tr><td>高文</td><td>北京大学</td></tr><tr><td>高小山</td><td>中国科学院</td></tr><tr><td>高新波</td><td>西安电子科技大学</td></tr><tr><td>公茂果</td><td>西安电子科技大学</td></tr><tr><td>关新平</td><td>上海交通大学</td></tr><tr><td>管晓宏</td><td>西安交通大学</td></tr><tr><td>郭振华</td><td>清华大学</td></tr><tr><td>韩敏</td><td>大连理工大学</td></tr><tr><td>何晓飞</td><td>浙江大
学</td></tr><tr><td>黑晓军</td><td>华中科技大学</td></tr><tr><td>侯增广</td><td>中国科学院</td></tr><tr><td>胡包钢</td><td>中国科学院</td></tr><tr><td>胡德文</td><td>国防科学技术大学</td></tr><tr><td>胡清华</td><td>天津大学</td></tr><tr><td>胡事民</td><td>清华大学</td></tr><tr><td>胡卫明</td><td>中国科学院</td></tr><tr><td>黄德双</td><td>同济大学</td></tr><tr><td>黄继武</td><td>中山大学</td></tr><tr><td>黄晓霞</td><td>北京科技大学</td></tr><tr><td>江健民</td><td>深圳大学</td></tr><tr><td>江涛</td><td>华中科技大学</td></tr><tr><td>姜大昕</td><td>微软亚洲研究院</td></tr><tr><td>焦李成</td><td>西安电子科技大学</td></tr><tr><td>金海</td><td>华中科技大学</td></tr><tr><td>李传东</td><td>西南大学</td></tr><tr><td>李春光</td><td>浙江大学</td></tr><tr><td>李登峰</td><td>福州大学</td></tr><tr><td>李国良</td><td>清华大学</td></tr><tr><td>李涵雄</td><td>中南大学</td></tr><tr><td>李洪兴</td><td>大连理工大学</td></tr><tr><td>李树涛</td><td>湖南大学</td></tr><tr><td>李学龙</td><td>中国科学院</td></tr><tr><td>李玉霞</td><td>山东科技大学</td></tr><tr><td>李远清</td><td>华南理工大学</td></tr><tr><td>李子青</td><td>中国科学院</td></tr><tr><td>梁吉业</td><td>山西大学</td></tr><tr><td>梁金玲</td><td>东南大学</td></tr><tr><td>梁艳春</td><td>吉林大学</td></tr><tr><td>廖晓峰</td><td>西南大学</td></tr><tr><td>林闯</td><td>清华大学</td></tr><tr><td>刘宝碇</td><td>清华大学</td></tr><tr><td>刘成林</td><td>中国科学院</td></tr><tr><td>刘德荣</td><td>中国科学院</td></tr><tr><td>刘利民</td><td>中国工程物理研究院</td></tr><tr><td>刘培德</td><td>山东财经大学</td></tr><tr><td>刘庆山</td><td>东南大学</td></tr><tr><td>刘彦奎</td><td>河北大学</td></tr><tr><td>楼旭阳</td><td>江南大学</td></tr><tr><td>卢宏涛</td><td>上海交通大学</td></tr><tr><td>鲁耀斌</td><td>华中科技大学</td></tr><tr><td>马书根</td><td>天津大学</td></tr><tr><td>马毅</td><td>上海科技大学</td></tr><tr><td>马宗民</td><td>东北大学</td></tr><tr><td>米据生</td><td>河北师范大学</td></tr><tr><td>潘林强</td><td>华中科技大学</td></tr><tr><td>潘全科</td><td>东北大学</td></tr><tr><td>庞彦伟</td><td>天津大学</td></tr><tr><td>彭晨</td><td>上海大学</td></tr><tr><td>彭怡</td><td>电子科技大学</td></tr><tr><td>钱宇华</td><td>山西大学</td></tr><tr><td>任丰原</td><td>清华大学</td></tr><tr><td>阮邦志</td><td>北京师范大学-香港浸会大学联合国际学院</td></tr><tr><td>芮勇</td><td>联想集团</td></tr><tr><td>沈纲祥</td><td>苏州大学</td></tr><tr><td>沈琳琳</td><td>深圳大学</td></tr><tr><td>石勇</td><td>中国科学院</
td></tr><tr><td>时小虎</td><td>吉林大学</td></tr><tr><td>孙仕亮</td><td>华东师范大学</td></tr><tr><td>谭松波</td><td>中国科学院</td></tr><tr><td>谭铁牛</td><td>中国科学院</td></tr><tr><td>谭晓阳</td><td>南京航空航天大学</td></tr><tr><td>唐杰</td><td>清华大学</td></tr><tr><td>唐小虎</td><td>西南交通大学</td></tr><tr><td>陶文兵</td><td>华中科技大学</td></tr><tr><td>佟绍成</td><td>辽宁工业大学</td></tr><tr><td>王聪</td><td>华南理工大学</td></tr><tr><td>王飞跃</td><td>中国科学院</td></tr><tr><td>王国军</td><td>中南大学</td></tr><tr><td>王国胤</td><td>重庆邮电大学</td></tr><tr><td>王瀚漓</td><td>同济大学</td></tr><tr><td>王怀清</td><td>南方科技大学</td></tr><tr><td>王亮</td><td>中国科学院</td></tr><tr><td>王熙照</td><td>河北大学</td></tr><tr><td>王兴伟</td><td>东北大学</td></tr><tr><td>王绪柱</td><td>太原理工大学</td></tr><tr><td>王雪</td><td>清华大学</td></tr><tr><td>王应明</td><td>福州大学</td></tr><tr><td>卫贵武</td><td>四川师范大学</td></tr><tr><td>文福栓</td><td>浙江大学</td></tr><tr><td>邬向前</td><td>哈尔滨工业大学</td></tr><tr><td>吴伟志</td><td>浙江海洋大学</td></tr><tr><td>吴争光</td><td>浙江大学</td></tr><tr><td>伍世虔</td><td>江西财经大学</td></tr><tr><td>夏锋</td><td>大连理工大学</td></tr><tr><td>夏又生</td><td>福州大学</td></tr><tr><td>肖迪</td><td>重庆大学</td></tr><tr><td>徐勇</td><td>哈尔滨工业大学</td></tr><tr><td>徐泽水</td><td>四川大学</td></tr><tr><td>徐正元</td><td>中国科学技术大学</td></tr><tr><td>杨苏</td><td>华南理工大学</td></tr><tr><td>殷允强</td><td>昆明理工大学</td></tr><tr><td>于永光</td><td>北京交通大学</td></tr><tr><td>余乐安</td><td>北京化工大学</td></tr><tr><td>俞立</td><td>浙江工业大学</td></tr><tr><td>喻俊志</td><td>中国科学院</td></tr><tr><td>袁晓辉</td><td>华中科技大学</td></tr><tr><td>詹志辉</td><td>华南理工大学</td></tr><tr><td>张道强</td><td>南京航空航天大学</td></tr><tr><td>张化光</td><td>东北大学</td></tr><tr><td>张敏灵</td><td>东南大学</td></tr><tr><td>张强</td><td>大连大学</td></tr><tr><td>张师超</td><td>广西师范大学</td></tr><tr><td>张田昊</td><td>上海交通大学</td></tr><tr><td>张新鹏</td><td>上海大学</td></tr><tr><td>张雨浓</td><td>中山大学</td></tr><tr><td>张煜东</td><td>南京师范大学</td></tr><tr><td>章毅</td><td>四川大学</td></tr><tr><td>章毓晋</td><td>清华大学</td></tr><tr><td>周东华</td><td>山东科技大学</td></tr><tr><td>周根贵</td><td>浙江工业大学</td></tr><tr><td>周杰</td><td>清华大学</td></tr><tr><td>周昆</td><td>浙江大学</td></tr><tr><td>周涛</td><td>电子科技大学</td>
</tr><tr><td>周志华</td><td>南京大学</td></tr><tr><td>诸葛海</td><td>中国科学院</td></tr><tr><td>祝峰</td><td>电子科技大学</td></tr></tbody></table></div></div><hr /><footer><p>Copyright &#169; 2018 Elsevier. All rights reserved.</p></footer></div><img src ="http://www.cppblog.com/guijie/aggbug/215482.html" width = "1" height = "1" /><br><br><div align=right><a style="text-decoration:none;" href="http://www.cppblog.com/guijie/" target="_blank">杰哥</a> 2018-01-23 04:50 <a href="http://www.cppblog.com/guijie/archive/2018/01/23/215482.html#Feedback" target="_blank" style="text-decoration:none;">发表评论</a></div>]]></description></item><item><title>Professor Deng Cai's code, very good code</title><link>http://www.cppblog.com/guijie/archive/2017/11/26/215370.html</link><dc:creator>杰哥</dc:creator><author>杰哥</author><pubDate>Sun, 26 Nov 2017 05:10:00 GMT</pubDate><guid>http://www.cppblog.com/guijie/archive/2017/11/26/215370.html</guid><wfw:comment>http://www.cppblog.com/guijie/comments/215370.html</wfw:comment><comments>http://www.cppblog.com/guijie/archive/2017/11/26/215370.html#Feedback</comments><slash:comments>0</slash:comments><wfw:commentRss>http://www.cppblog.com/guijie/comments/commentRss/215370.html</wfw:commentRss><trackback:ping>http://www.cppblog.com/guijie/services/trackbacks/215370.html</trackback:ping><description><![CDATA[<span style="color: #212121; font-family: wf_segoe-ui_normal, &quot;Segoe UI&quot;, &quot;Segoe WP&quot;, Tahoma, Arial, sans-serif, serif, EmojiFont; font-size: 15px; background-color: #ffffff;">20171125 rutgers&nbsp;email: Please check&nbsp;</span><a href="https://na01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fgithub.com%2Fdengcai78%2FMatlabFunc&amp;data=02%7C01%7Cjie.gui%40rutgers.edu%7Ccc91a8dece38489b84c708d534888d5f%7Cb92d2b234d35447093ff69aca6632ffe%7C1%7C0%7C636472683460873626&amp;sdata=O8pd0ZRw5nRMt7QFQiVCYCDLcRwKhYDjmlzpZ3LPmy0%3D&amp;reserved=0" target="_blank" rel="noopener noreferrer" style="font-family: wf_segoe-ui_normal, &quot;Segoe 
UI&quot;, &quot;Segoe WP&quot;, Tahoma, Arial, sans-serif, serif, EmojiFont; font-size: 15px; background-color: #ffffff;">https://github.com/dengcai78/MatlabFunc</a><span style="color: #212121; font-family: wf_segoe-ui_normal, &quot;Segoe UI&quot;, &quot;Segoe WP&quot;, Tahoma, Arial, sans-serif, serif, EmojiFont; font-size: 15px; background-color: #ffffff;">&nbsp; for the most up-to-date codes.</span><img src ="http://www.cppblog.com/guijie/aggbug/215370.html" width = "1" height = "1" /><br><br><div align=right><a style="text-decoration:none;" href="http://www.cppblog.com/guijie/" target="_blank">杰哥</a> 2017-11-26 13:10 <a href="http://www.cppblog.com/guijie/archive/2017/11/26/215370.html#Feedback" target="_blank" style="text-decoration:none;">发表评论</a></div>]]></description></item><item><title>价值250亿美元的特征向量：Google背后的线性代数</title><link>http://www.cppblog.com/guijie/archive/2017/05/02/214895.html</link><dc:creator>杰哥</dc:creator><author>杰哥</author><pubDate>Tue, 02 May 2017 00:03:00 GMT</pubDate><guid>http://www.cppblog.com/guijie/archive/2017/05/02/214895.html</guid><wfw:comment>http://www.cppblog.com/guijie/comments/214895.html</wfw:comment><comments>http://www.cppblog.com/guijie/archive/2017/05/02/214895.html#Feedback</comments><slash:comments>0</slash:comments><wfw:commentRss>http://www.cppblog.com/guijie/comments/commentRss/214895.html</wfw:commentRss><trackback:ping>http://www.cppblog.com/guijie/services/trackbacks/214895.html</trackback:ping><description><![CDATA[<div id="weibo-4102482772109011" style="box-sizing: border-box; position: relative; font-size: 12px; background: #eeeeee; padding: 5px; word-wrap: break-word; margin-bottom: 10px; color: #3b3b3b; font-family: &quot;Open Sans&quot;, Arial, Helvetica, sans-serif;"><div style="box-sizing: border-box; margin-bottom: 10px; margin-left: 0px;"><div style="box-sizing: border-box; font-size: 16px; margin: 5px 5px 10px;">【价值250亿美元的特征向量：Google背后的线性代数】《The $25,000,000,000 eigenvector: The linear algebra behind 
Google》K Bryan, T Leise (2006)&nbsp;<a href="http://t.cn/zlnwlj5" target="_blank" title="http://www.rose-hulman.edu/~bryan/googleFinalVersionFixed.pdf" rel="nofollow" data-slimstat-clicked="false" data-slimstat-type="0" data-slimstat-tracking="true" data-slimstat-callback="false" style="box-sizing: border-box; color: #3b3b3b; text-decoration-line: none; outline: none !important;">http://t.cn/zlnwlj5</a>&nbsp;&#8203;</div><div style="box-sizing: border-box; padding: 10px; max-height: 420px; overflow: hidden; margin-bottom: 10px;"><a target="_blank" title="点击查看大图" href="http://ww3.sinaimg.cn/large/5396ee05jw1ff5gpzdcfgj21dw07gdoj.jpg" rel="nofollow" data-slimstat-clicked="false" data-slimstat-type="0" data-slimstat-tracking="true" data-slimstat-callback="false" style="box-sizing: border-box; color: #95a5a6; text-decoration-line: none; outline: 0px;"><img src="http://ww3.sinaimg.cn/large/5396ee05jw1ff5gpzdcfgj21dw07gdoj.jpg" alt="" style="box-sizing: border-box; border: 0px; vertical-align: middle; max-width: 260px; height: auto; padding: 10px 0px;" /></a></div><p style="box-sizing: border-box; margin: 0px; line-height: 22px; padding-bottom: 0px;"></p></div><p style="box-sizing: border-box; margin: 0px; line-height: 22px; padding-bottom: 0px;"></p></div><div id="4102486203659995" style="box-sizing: border-box; margin-left: 0px; position: relative; font-size: 12px; background: #fefcff; padding: 5px; margin-bottom: 10px; color: #3b3b3b; font-family: &quot;Open Sans&quot;, Arial, Helvetica, sans-serif;"><div style="box-sizing: border-box; margin: 5px;"><span style="box-sizing: border-box; font-size: 14px; padding: 0px;"><a title="functicons 的主页" href="http://www.weibo.com/n/functicons" target="_blank" rel="nofollow" data-slimstat-clicked="false" data-slimstat-type="0" data-slimstat-tracking="true" data-slimstat-callback="false" style="box-sizing: border-box; color: #3b3b3b; text-decoration-line: none; outline: none !important;"><strong style="box-sizing: 
border-box;">functicons</strong></a></span>&nbsp;<span style="box-sizing: border-box;"><a href="http://www.weibo.com/1822142792/F12G0fm8b" target="_blank" rel="nofollow" data-slimstat-clicked="false" data-slimstat-type="0" data-slimstat-tracking="true" data-slimstat-callback="false" style="box-sizing: border-box; color: #3b3b3b; text-decoration-line: none; outline: none !important;">网页版</a>&nbsp;转发于<span style="box-sizing: border-box;">2017-05-01 06:46</span></span></div><div style="box-sizing: border-box; font-size: 16px; margin: 5px 5px 10px;">The weights of the web pages are the eigenvector of the link matrix. This is mathematically simple and beautiful; the real challenge is the scale of the matrix: 10B * 10B, but because it&#8217;s sparse, the real scale is 10B * c.</div></div><img src ="http://www.cppblog.com/guijie/aggbug/214895.html" width = "1" height = "1" /><br><br><div align=right><a style="text-decoration:none;" href="http://www.cppblog.com/guijie/" target="_blank">杰哥</a> 2017-05-02 08:03 <a href="http://www.cppblog.com/guijie/archive/2017/05/02/214895.html#Feedback" target="_blank" style="text-decoration:none;">发表评论</a></div>]]></description></item><item><title>[zz] 跟班式科研，误己误国——某国立研究所所长的自白 | 争鸣</title><link>http://www.cppblog.com/guijie/archive/2016/08/26/214232.html</link><dc:creator>杰哥</dc:creator><author>杰哥</author><pubDate>Fri, 26 Aug 2016 01:23:00 GMT</pubDate><guid>http://www.cppblog.com/guijie/archive/2016/08/26/214232.html</guid><wfw:comment>http://www.cppblog.com/guijie/comments/214232.html</wfw:comment><comments>http://www.cppblog.com/guijie/archive/2016/08/26/214232.html#Feedback</comments><slash:comments>0</slash:comments><wfw:commentRss>http://www.cppblog.com/guijie/comments/commentRss/214232.html</wfw:commentRss><trackback:ping>http://www.cppblog.com/guijie/services/trackbacks/214232.html</trackback:ping><description><![CDATA[&nbsp;&nbsp;&nbsp;&nbsp; 摘要: 
http://mp.weixin.qq.com/s?__biz=MzIyNDA2NTI4Mg==&amp;mid=2655408987&amp;idx=1&amp;sn=aeb8266a5bd6db4f35e7e3f7bdf8c837&amp;scene=1&amp;srcid=0808iEWai3igwfXbkcGtq2iW&amp;from=groupmessage&amp;isappinst...&nbsp;&nbsp;<a href='http://www.cppblog.com/guijie/archive/2016/08/26/214232.html'>阅读全文</a><img src ="http://www.cppblog.com/guijie/aggbug/214232.html" width = "1" height = "1" /><br><br><div align=right><a style="text-decoration:none;" href="http://www.cppblog.com/guijie/" target="_blank">杰哥</a> 2016-08-26 09:23 <a href="http://www.cppblog.com/guijie/archive/2016/08/26/214232.html#Feedback" target="_blank" style="text-decoration:none;">发表评论</a></div>]]></description></item><item><title>全新的美国计算机学科排名</title><link>http://www.cppblog.com/guijie/archive/2016/07/21/214021.html</link><dc:creator>杰哥</dc:creator><author>杰哥</author><pubDate>Thu, 21 Jul 2016 04:40:00 GMT</pubDate><guid>http://www.cppblog.com/guijie/archive/2016/07/21/214021.html</guid><wfw:comment>http://www.cppblog.com/guijie/comments/214021.html</wfw:comment><comments>http://www.cppblog.com/guijie/archive/2016/07/21/214021.html#Feedback</comments><slash:comments>1</slash:comments><wfw:commentRss>http://www.cppblog.com/guijie/comments/commentRss/214021.html</wfw:commentRss><trackback:ping>http://www.cppblog.com/guijie/services/trackbacks/214021.html</trackback:ping><description><![CDATA[<div w_f14"="" node-type="feed_list_content" nick-name="王威廉">转自王威廉20160720微博：<br />全新的美国计算机学科排名，号称与US News的系主任主观打分排名不同，这项排名针对计算机领域最顶尖的会议计算而来。CMU总体排名第一，拥有123名教授。UC Santa Barbara在少于30名Faculty的小型计算机系中仅次于哈佛，与普林斯顿并列第二。<a title="网页链接" href="http://t.cn/RtZOAuB" target="_blank" suda-uatrack="key=minicard&amp;value=pagelink_minicard_click" action-type="feed_list_url"><em ficon_cd_link"="">O</em>网页链接:<a title="网页链接" href="http://t.cn/RtZOAuB" target="_blank" suda-uatrack="key=minicard&amp;value=pagelink_minicard_click" action-type="feed_list_url"><div style="display: inline !important;">http://csrankings.org/</div></a><br /><img 
src="http://www.cppblog.com/images/cppblog_com/guijie/a.jpg" width="565" height="614" alt="" /><br /></a></div><img src ="http://www.cppblog.com/guijie/aggbug/214021.html" width = "1" height = "1" /><br><br><div align=right><a style="text-decoration:none;" href="http://www.cppblog.com/guijie/" target="_blank">杰哥</a> 2016-07-21 12:40 <a href="http://www.cppblog.com/guijie/archive/2016/07/21/214021.html#Feedback" target="_blank" style="text-decoration:none;">发表评论</a></div>]]></description></item><item><title>[zz] 现在主要的期刊分区有几种？</title><link>http://www.cppblog.com/guijie/archive/2016/06/15/213721.html</link><dc:creator>杰哥</dc:creator><author>杰哥</author><pubDate>Wed, 15 Jun 2016 11:51:00 GMT</pubDate><guid>http://www.cppblog.com/guijie/archive/2016/06/15/213721.html</guid><wfw:comment>http://www.cppblog.com/guijie/comments/213721.html</wfw:comment><comments>http://www.cppblog.com/guijie/archive/2016/06/15/213721.html#Feedback</comments><slash:comments>0</slash:comments><wfw:commentRss>http://www.cppblog.com/guijie/comments/commentRss/213721.html</wfw:commentRss><trackback:ping>http://www.cppblog.com/guijie/services/trackbacks/213721.html</trackback:ping><description><![CDATA[<a href="http://blog.sciencenet.cn/blog-303458-804525.html">http://blog.sciencenet.cn/blog-303458-804525.html</a><br /><br /><strong><font size="4" face="楷体">现在主要的期刊分区有几种？</font></strong> 
<p style="text-align: left; margin: 0px 0px 10px 24px; line-height: 27px; text-indent: 37px; -ms-layout-grid-mode: char"><span style="font-size: 19px; font-family: 楷体">答：有两种。</span></p>
<p style="margin-bottom: 10px; font-size: 14px; font-weight: normal; color: rgb(0,0,0); font-style: normal; text-align: left; margin-top: 0px; line-height: 27px; -ms-layout-grid-mode: char"><span style="font-size: 18px; font-family: 楷体,楷体_gb2312, simkai">&nbsp; &nbsp;</span><strong><span style="font-size: 18px; font-family: 楷体,楷体_gb2312, simkai">A、JCR</span><span style="font-size: 19px; font-family: 楷体"><span style="font-size: 18px; font-family: 楷体,楷体_gb2312, simkai">（又称</span><span style="font-size: 18px; font-family: 楷体,楷体_gb2312, simkai">汤森路透）分区法</span></span></strong></p>
<p style="font-family: ; color: rgb(0,0,0); text-align: left; line-height: 27px" dir="ltr">&nbsp; &nbsp;<a name="OLE_LINK1"><span style="font-size: 18px; font-family: 楷体,楷体_gb2312, simkai">汤森路透</span><span style="font-size: 18px; font-family: 楷体,楷体_gb2312, simkai"></span></a><span style="font-size: 18px; font-family: 楷体,楷体_gb2312, simkai">（</span><span style="font-size: 18px; font-family: times new roman">Thomson Reuters</span><span style="font-size: 18px; font-family: 楷体,楷体_gb2312, simkai">）每年出版一本《期刊引用报告》（</span><span style="font-size: 18px; font-family: times new roman">Journal Citation Reports</span><span style="font-size: 19px; font-family: 楷体"><span style="font-size: 18px; font-family: 楷体,楷体_gb2312, simkai">，简称JCR）。JCR对8600多种SCI期刊的影响因子</span></span><span style="font-size: 19px; font-family: 楷体"><span style="font-size: 18px; font-family: 楷体,楷体_gb2312, simkai"></span></span><span style="font-size: 19px; font-family: 楷体"><a href="http://blog.sciencenet.cn/#_ftn1" name="_ftnref1"><span style="font-size: 18px; font-family: 楷体,楷体_gb2312, simkai">[1]</span></a><span style="font-size: 18px; font-family: 楷体,楷体_gb2312, simkai">（Impact Factor）等指数加以统计。</span></span><span style="font-size: 18px; font-family: 楷体,楷体_gb2312, simkai">JCR</span><span style="font-size: 18px; font-family: 楷体,楷体_gb2312, simkai">将收录期刊分为176个不同学科类别。每个学科分类按照期刊的影响因子高低，平均分为Q1、Q2、Q3和Q4四个区：<br /></span><span style="font-size: 18px; font-family: 楷体,楷体_gb2312, simkai">&nbsp;</span><span style="font-size: 18px; font-family: 楷体,楷体_gb2312, simkai">&nbsp;</span><span style="font-size: 18px; font-family: 楷体,楷体_gb2312, simkai; color: red">各学科分类中影响因子前25%(含25%)期刊划分为Q1区、前25-50% (含50%)为Q2区、前50-75% (含75% )为Q3区、后75%为Q4区。</span></p>
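上述四分区规则可以用几行代码示意（假设性示例，期刊名与影响因子均为虚构，仅演示按排名百分位划分 Q1&#8211;Q4 的逻辑）：

```python
def jcr_quartile(journals):
    """journals: {期刊名: 影响因子}；返回 {期刊名: 'Q1'..'Q4'}。
    规则：同一学科内按影响因子排名，前25%(含)为Q1，
    25-50%(含)为Q2，50-75%(含)为Q3，其余为Q4。"""
    ranked = sorted(journals, key=journals.get, reverse=True)
    n = len(ranked)
    result = {}
    for i, name in enumerate(ranked, start=1):
        ratio = i / n                # 该期刊在学科内的排名百分位
        if ratio <= 0.25:
            result[name] = "Q1"
        elif ratio <= 0.50:
            result[name] = "Q2"
        elif ratio <= 0.75:
            result[name] = "Q3"
        else:
            result[name] = "Q4"
    return result

# 虚构数据演示
print(jcr_quartile({"A": 8.0, "B": 4.0, "C": 2.0, "D": 1.0}))
# {'A': 'Q1', 'B': 'Q2', 'C': 'Q3', 'D': 'Q4'}
```

注意按此规则，学科内期刊数很少时（如只有两种），排名第一的期刊也可能落在 Q2，这与按百分位划分的定义是一致的。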
<p style="font-family: ; color: rgb(0,0,0); text-align: left; line-height: 27px"><span style="font-size: 19px; font-family: 楷体"><span style="font-size: 18px; font-family: 楷体,楷体_gb2312, simkai">&nbsp;</span><strong><span style="font-size: 18px; font-family: 楷体,楷体_gb2312, simkai">B、</span><span style="font-size: 18px; font-family: 楷体,楷体_gb2312, simkai">中国科学院分区法</span></strong><br /><span style="font-size: 18px; font-family: 楷体,楷体_gb2312, simkai">&nbsp; &nbsp;</span></span><span style="font-size: 18px; font-family: 楷体,楷体_gb2312, simkai">中国科学院国家科学图书馆世界科学前沿分析中心（原中国科学院文献情报中心）</span><span style="font-size: 18px; font-family: 楷体,楷体_gb2312, simkai">根据</span><span style="font-size: 18px; font-family: 楷体,楷体_gb2312, simkai">汤森路透</span><span style="font-size: 18px; font-family: 楷体,楷体_gb2312, simkai">每年的JCR数据，创新划分了一个分区区间，形成了中科院的分区标准。</span></p><img src ="http://www.cppblog.com/guijie/aggbug/213721.html" width = "1" height = "1" /><br><br><div align=right><a style="text-decoration:none;" href="http://www.cppblog.com/guijie/" target="_blank">杰哥</a> 2016-06-15 19:51 <a href="http://www.cppblog.com/guijie/archive/2016/06/15/213721.html#Feedback" target="_blank" style="text-decoration:none;">发表评论</a></div>]]></description></item><item><title>Autoencoder</title><link>http://www.cppblog.com/guijie/archive/2016/06/06/213654.html</link><dc:creator>杰哥</dc:creator><author>杰哥</author><pubDate>Mon, 06 Jun 2016 09:05:00 GMT</pubDate><guid>http://www.cppblog.com/guijie/archive/2016/06/06/213654.html</guid><wfw:comment>http://www.cppblog.com/guijie/comments/213654.html</wfw:comment><comments>http://www.cppblog.com/guijie/archive/2016/06/06/213654.html#Feedback</comments><slash:comments>0</slash:comments><wfw:commentRss>http://www.cppblog.com/guijie/comments/commentRss/213654.html</wfw:commentRss><trackback:ping>http://www.cppblog.com/guijie/services/trackbacks/213654.html</trackback:ping><description><![CDATA[Read twice <a 
href="http://deeplearning.stanford.edu/wiki/index.php/Autoencoders_and_Sparsity">http://deeplearning.stanford.edu/wiki/index.php/Autoencoders_and_Sparsity</a>&nbsp;and the Chinese version of http://deeplearning.stanford.edu/wiki/index.php/Visualizing_a_Trained_Autoencoder, and understand them completely.<br />Click at the bottom of each page for the Chinese version.<br />This was recommended by Chong Wang in IIM.<img src ="http://www.cppblog.com/guijie/aggbug/213654.html" width = "1" height = "1" /><br><br><div align=right><a style="text-decoration:none;" href="http://www.cppblog.com/guijie/" target="_blank">杰哥</a> 2016-06-06 17:05 <a href="http://www.cppblog.com/guijie/archive/2016/06/06/213654.html#Feedback" target="_blank" style="text-decoration:none;">发表评论</a></div>]]></description></item><item><title>What is the difference between weakly supervised learning and semi-supervised learning?</title><link>http://www.cppblog.com/guijie/archive/2016/05/25/213585.html</link><dc:creator>杰哥</dc:creator><author>杰哥</author><pubDate>Wed, 25 May 2016 15:07:00 GMT</pubDate><guid>http://www.cppblog.com/guijie/archive/2016/05/25/213585.html</guid><wfw:comment>http://www.cppblog.com/guijie/comments/213585.html</wfw:comment><comments>http://www.cppblog.com/guijie/archive/2016/05/25/213585.html#Feedback</comments><slash:comments>0</slash:comments><wfw:commentRss>http://www.cppblog.com/guijie/comments/commentRss/213585.html</wfw:commentRss><trackback:ping>http://www.cppblog.com/guijie/services/trackbacks/213585.html</trackback:ping><description><![CDATA[<div>&nbsp;Usually "weakly supervised" and "semi-supervised" are both defined with respect to two kinds of labels; if there is only one kind of label, the notion of weak supervision does not apply. For example, in object detection the labels are class and location. If only the class is given but not the location, only the weaker label is used, which is called weak supervision. If both class and location are given for only part of the samples, it is called semi-supervision. If there is only one kind of label, e.g. in object classification, there is no weakly supervised setting; using labels for only part of the samples is a semi-supervised problem. This is with Chong Wang's help and verified by Jingyu Liu.</div><img src ="http://www.cppblog.com/guijie/aggbug/213585.html" width = "1" height = "1" /><br><br><div align=right><a style="text-decoration:none;" href="http://www.cppblog.com/guijie/" target="_blank">杰哥</a> 2016-05-25 23:07 <a 
href="http://www.cppblog.com/guijie/archive/2016/05/25/213585.html#Feedback" target="_blank" style="text-decoration:none;">发表评论</a></div>]]></description></item><item><title>数据挖掘领域的全球专家列表</title><link>http://www.cppblog.com/guijie/archive/2016/03/04/212920.html</link><dc:creator>杰哥</dc:creator><author>杰哥</author><pubDate>Fri, 04 Mar 2016 02:16:00 GMT</pubDate><guid>http://www.cppblog.com/guijie/archive/2016/03/04/212920.html</guid><wfw:comment>http://www.cppblog.com/guijie/comments/212920.html</wfw:comment><comments>http://www.cppblog.com/guijie/archive/2016/03/04/212920.html#Feedback</comments><slash:comments>0</slash:comments><wfw:commentRss>http://www.cppblog.com/guijie/comments/commentRss/212920.html</wfw:commentRss><trackback:ping>http://www.cppblog.com/guijie/services/trackbacks/212920.html</trackback:ping><description><![CDATA[<span style="color: #3b3b3b; font-family: 'Open Sans', Arial, Helvetica, sans-serif; font-size: 16px; line-height: 22.8571px; background-color: #eeeeee;">发布一个数据挖掘领域的全球专家列表，</span><a href="http://t.cn/RGlEOWy" target="_blank" title="https://aminer.org/datamining-experts" rel="nofollow" data-slimstat-clicked="false" data-slimstat-type="0" data-slimstat-tracking="true" data-slimstat-callback="false" data-slimstat-async="true" style="box-sizing: border-box; color: #3b3b3b; text-decoration: none; font-family: 'Open Sans', Arial, Helvetica, sans-serif; font-size: 16px; line-height: 22.8571px; outline: none !important; background-color: #eeeeee;">http://t.cn/RGlEOWy</a><span style="color: #3b3b3b; font-family: 'Open Sans', Arial, Helvetica, sans-serif; font-size: 16px; line-height: 22.8571px; background-color: #eeeeee;">&nbsp;包含300多位专家，既有学术界的，也有工业界的，提供每位专家详细profile（基本信息、联系方式、研究兴趣），还包括性别、能讲的语言（比如中文）。更多专家列表将随后发布。</span><img src ="http://www.cppblog.com/guijie/aggbug/212920.html" width = "1" height = "1" /><br><br><div align=right><a style="text-decoration:none;" href="http://www.cppblog.com/guijie/" target="_blank">杰哥</a> 2016-03-04 10:16 <a 
href="http://www.cppblog.com/guijie/archive/2016/03/04/212920.html#Feedback" target="_blank" style="text-decoration:none;">发表评论</a></div>]]></description></item><item><title>刘铁岩：在微软大学的三次华丽转型</title><link>http://www.cppblog.com/guijie/archive/2016/01/21/212701.html</link><dc:creator>杰哥</dc:creator><author>杰哥</author><pubDate>Thu, 21 Jan 2016 02:05:00 GMT</pubDate><guid>http://www.cppblog.com/guijie/archive/2016/01/21/212701.html</guid><wfw:comment>http://www.cppblog.com/guijie/comments/212701.html</wfw:comment><comments>http://www.cppblog.com/guijie/archive/2016/01/21/212701.html#Feedback</comments><slash:comments>0</slash:comments><wfw:commentRss>http://www.cppblog.com/guijie/comments/commentRss/212701.html</wfw:commentRss><trackback:ping>http://www.cppblog.com/guijie/services/trackbacks/212701.html</trackback:ping><description><![CDATA[<div>http://blog.sina.cn/dpool/blog/s/blog_4caedc7a0102w57s.html?wm=3049_a111<br /><br />一个理想的研究人员成长轨迹应该是什么样的？<br />微软全球执行副总裁沈向洋博士认为一个酷酷的研究员应该是这样的：&#8220;挑选一个雄心勃勃的目标，致力于端到端的研究，长久的坚持，而他的研究伙伴们也应该有着同样的激情，但最重要的是始终乐在其中。&#8221;<br />如果以这个标准来看，微软亚洲研究院首席研究员<a href="http://research.microsoft.com/en-us/people/tyliu/" target="_blank">刘铁岩</a>博士可谓是研究员的范本。2003年，从清华大学电子工程系博士毕业之后，刘铁岩直接进入了微软亚洲研究院，在这一待就是十余年。这十多年间，刘铁岩博士由原本的多媒体信号处理方向的博士，逐步成长为国际机器学习和信息检索领域的知名学者。这些转型在外人看起来十分巨大，但&#8220;三清&#8221;（本科、硕士和博士都就读于清华大学）出身的刘铁岩博士说：&#8220;微软亚洲研究院其实是我的第二所大学，这是一个培养人的地方，有了她的帮助，这一切其实过渡地非常自然。&#8221;<br />
<div><strong>开放环境带来的首次转型</strong><br />在结束了九年的清华校园学习时，摆在刘铁岩面前的选择有很多，例如知名大学教职等等，而微软亚洲研究院吸引刘铁岩的除了全球领先的研究环境之外，更重要的是可以和自己敬仰已久的顶尖行业大牛一起工作，对于一个刚开始入行的年轻研究员来说无疑动力巨大。因此，刘铁岩于2003年正式加入了微软亚洲研究院，并由原来的多媒体信号处理方向的研究转入了互联网搜索与挖掘领域，从此开始了对信息检索这一全新领域的探索。<br />这是刘铁岩在研究院的第一次转型，但这次转型并不像人们想象的那么艰难，因为微软亚洲研究院为研究人员提供了一个十分开放的科研环境，让研究员们有充分的自由和资源来调整自己的研究兴趣。在这里刘铁岩和很多不同研究方向的资深研究员们进行了交流，其中包括他后来的老板，现在的<a href="http://www.msra.cn/zh-cn/about/leadership/wei-ying-ma.aspx" target="_blank">微软亚洲研究院常务副院长马维英博士</a>。同时，借助研究院这个平台刘铁岩还与众多国际知名学者进行了深入交流，进一步拓宽了其科研视野，刘铁岩博士首次转型的领路人便是卡内基梅隆大学文本分类领域的资深专家杨颐明教授。2004年暑假，正处于转型期的刘铁岩博士遇见了前来微软亚洲研究院交流的杨教授，便一拍即合地展开了合作。他们共同搭建了当时世界上最大的、近三十万类的文本分类系统，相关论文受到了广泛关注，短短几年间就被引用了数百次。这次和杨教授的合作也成了刘铁岩进入文本信息处理领域的第一块敲门砖。从那时起，刘铁岩开始了解什么是信息检索，什么是文本分类系统，他的首次转型也逐步成型。<br />
<div><strong>挑选一个雄心勃勃的目标：排序学习</strong><br />第一次转型之后，刘铁岩作为信息检索领域的新人，始终保持着旺盛的好奇心，不断思考着能为这个领域带来哪些新东西。当时围绕搜索引擎所开展的研究十分火热，信息检索更是人们关注的重中之重。通过大量的文献研究，刘铁岩发现这个方向大多数的研究者都是数字图书馆专业背景，因此研究方法都偏向经验化，缺少了对于优化系统方式和目标的科学思考。<br />基于对行业的洞察，刘铁岩开始深入学习机器学习的相关知识，并试图把机器学习的思想引入信息检索领域。由此，刘铁岩博士在学术界的第一个成名工作&#8212;&#8212;排序学习（learning to rank）就这样诞生了，该方法为信息检索领域带来了重大变革。<br />随后，刘铁岩的研究便围绕排序学习展开。在2007到2008年，刘铁岩和他的团队在SIGIR、WWW、ICML等顶级学术会议上发表了大量的关于排序学习的论文，还在主流会议上做主题讲座、主持专题研讨会。他的表现受到了学术界越来越多的关注，更多的研究人员跟随他进入到这个领域中来，短短的几年时间刘铁岩及其团队的研究实力便在全世界的信息检索领域内遥遥领先。而刘铁岩博士出的第一本学术专著也与排序学习相关。该专著已被多所大学作为教科书、并被其他学者引用了近千次。<br />致力于端到端的系统性研究 在2008年到2009年左右，排序学习领域尽管很繁荣，但是多数人仍把排序学习作为应用级的研究。在机器学习领域的主流学术会议中，排序学习通常也会被分到应用领域（application track）。<br />刘铁岩很快就发现了这其中的原因：一个研究领域如果缺少科研理论的话，是无法被广泛认可的。因此在后来的几年时间里，刘铁岩和他的研究团队花费了大量时间从理论的角度把排序学习领域正式化，去阐述这个领域是什么、目标是什么、各种算法的关系是什么、有什么样的理论性质等等。他们在ICML、NIPS、COLT等顶级机器学习会议上发表了大量排序学习的理论文章，即使到今天这些论文的影响力也十分深刻。在这整个的研究周期内，刘铁岩及其团队把排序学习打造成一个完整的研究领域，并通过从算法到理论的一系列研究成果，让这个领域真正的火了起来，刘铁岩也成了这一研究领域当之无愧的代表人物。<br />这就是微软亚洲研究院里一个典型的研究案例。刘铁岩在微软内部的导师Rakesh Agrawal院士曾告诉他：&#8220;<span style="color: red">对于研究人员来说，并不是为了发表论文而发论文，而是要在特定的历史阶段，针对一个重要的问题，从表面到核心全部做到位</span>。&#8221;一直到今天，排序学习一直都是很多会议的主要方向之一，仍然有很多学者在进行研究。正是因为这些工作，刘铁岩博士完成了他的第二次转型&#8212;&#8212;由信息检索转变到了机器学习。<br />
<div><strong>第三次转型：博弈机器学习</strong><br />在微软亚洲研究院，研究员的研究成果除了作为论文发表出来之外，还会应用到微软的各个产品中。通过与产品部门合作，研究员们可以发现实际应用中的新问题。刘铁岩团队与微软的在线广告部门的合作就是其中一个非常有代表性的实例。<br />这项合作始于排序学习，刘铁岩和团队成员帮微软广告部门离线训练了一个效果极佳的机器学习模型用于必应广告搜索中的竞价排名。上线之初模型立刻带来了很大的效益，但随着时间的推移，广告效益却大打折扣。刘铁岩和他的团队发现了这个问题，并找到了奇怪现象的根源：广告竞价排名过程常常涉及到人（广告主）的因素，广告主会因为算法的改变带来的价格变化，敏锐地调整自己的广告投放策略，这是一个动态过程。如果不考虑经济规律和人的动态策略，离线地进行机器学习模型的训练，结果自然会产生很大的偏差。<br />如果想把广告竞价这个动态问题解释清楚，仅有机器学习的知识背景显然是不够的。所以刘铁岩便带领其团队开始学习博弈论，计算经济学等等，<span style="color: red">组名也改成了&#8220;互联网经济研究组&#8221;，这便是他第三次转型的开始</span>。在这个转型过程中，他发明了一种全新的技术，称为&#8220;博弈机器学习&#8221;，把博弈论的思想引入到机器学习的过程中，来对人的动态策略进行建模，从而解决上文提到的难题。<br />如果你了解博弈论和机器学习分别是什么的话，就会发现这两个领域差别巨大，完全是不同的体系，那么这次转型的难度也可想而知。刘铁岩博士说：&#8220;对于任何一位研究人员，如果不是在微软亚洲研究院的话，这种转型都是非常困难的。因为，如果你开始学习新东西，想要有这个领域的人认识、认可你，并产生顶级的影响力是十分艰难的。但当我们真正去做的时候，发现微软亚洲研究院给了我们很多帮助，这让我们对新领域的研究变得轻松不少。&#8221;当刘铁岩和他们组的研究员们开始涉足互联网经济领域时，不仅有来自微软其他研究院在博弈论领域颇有建树的同事（如Noam Nisan）的帮助、也有很多来自学界的博弈论专家（如邓小铁教授、叶荫宇教授等）抛出了橄榄枝。他们互相访问，一起参加各种学术活动，互相交流，在很短的时间内，刘铁岩他们就对博弈论这一研究方向有了很多深刻的认识：不仅在算法博弈论领域的顶级会议上发表了多篇论文，还在互联网经济研究组成立不到两年的时间里，以程序委员会主席的身份把全世界第二的算法博弈论会议&#8212;&#8212;互联网经济大会（WINE）带到中国。<br />
<div>黄金三镖客：电子，数学和计算机 &nbsp;微软亚洲研究院 人工智能组三次转型，成就了刘铁岩博士一路创新不断的探索和发现，然而这背后也离不开其整个研究团队的支持与努力。<span style="color: #ff0000">现在，刘铁岩博士带领的团队更名为&#8220;人工智能组&#8221;</span>，继续在当下火热的机器学习和人工智能领域进行深耕。不久前，<a href="http://blog.sina.cn/dpool/blog/s/blog_4caedc7a0102w04g.html?vt=4" target="_blank">微软亚洲研究院对外开源的DMTK（分布式机器学习工具包）</a>便是这个小组的研究成果。</div>如果给这个研究组寻找一个关键词的话，那一定是&#8220;求知欲&#8221;。从刘铁岩的三次转型中也不难发现，现名为人工智能组的研究员们绝非循规蹈矩之人，他们有着强烈的求知欲，就像初生牛犊不怕虎一样，<span style="color: red">知难而进，什么不会学什么，什么难做什么</span>，朝气十足。<br />而另一方面，该团队的组合十分有趣，就像微软亚洲研究院的一个小小缩影一样。研究员们的专业覆盖面既不是全部精钻于机器学习，也不是全部埋头在博弈论上。目前，人工智能组有三分之一的研究员出自数学系，专业包括计算数学、概率论和组合数学，这涵盖了该团队所需要的所有数学基础。另外三分之一的研究员，包括刘铁岩在内都是来自电子工程专业，刘铁岩博士认为，电子工程专业出身的人有一个很大的优点便是有着非常好的直觉，并且不局限自己的思路，十分开放。而其余三分之一的研究员则是计算机专业出身，他们都拥有很强的计算机技能。当数学、电子和计算机三拨精英碰撞在一起的时候，就没有什么研究方向能难得住他们了。<br />此外，人工智能组还是一个十分重视学术和工程实践相结合的团队。他们的很多启发与灵感都来自于与微软产品部门的合作，因此，这是一个不断提出新问题的团队。在人工智能组发表的论文中你可以看到一个很明显的特点：<span style="color: #ff0000">团队很少循规蹈矩地解决别人提出的问题，而是经常提出新的问题，并给出一个力所能及范围内的最优解。这样的论文常常有很高的引用数，平均下来，刘铁岩和他的团队发表的论文几乎每篇都有上百次的引用。</span><br />三次转型带来了如今人工智能研究组的团队凝聚力。一加一大于二，小组的很多论文都有至少一个电子，一个计算机和一个数学背景的研究员参与，这样的论文都非常有特点，也能满足各种要求，无论是定力证明、直觉、还是实现的精巧，都可圈可点。<br />&#8220;争吵文化&#8221;与&#8220;真理不辨不明&#8221; 刘铁岩博士带领的人工智能组还有一个十分有趣的&#8220;争吵文化&#8221;。在接受采访时，笔者对刘铁岩博士嘴里说出的&#8220;争吵文化&#8221;感到十分难以置信。坐在对面的刘铁岩博士穿着经典款的男士衬衫，外套一件淡灰色的羊毛开衫，学院气息浓厚，让人似乎很难将他与&#8220;争吵&#8221;联系在一起。<br />&#8220;我们团队几乎会天天争吵。&#8221;刘铁岩博士笑言。但这其实是研究组最有活力的状态，开会的时候，大家不会在乎职位高低，就一个问题会针锋相对地表达自己的观点。人工智能组全组上下都坚持的一个信条是&#8220;真理不辨不明&#8221;。在刘铁岩的带领下，整个组会相互批判的看问题，就连待久一点的实习生也会自然的融入其中，和他的导师间也是一种互相辩论，互相学习的关系。<br />因此，对于实习生来说，进入微软亚洲研究院会带来巨大的成长。首先是知识的积累，很多实习生在进研究院之初知识非常有限。但微软亚洲研究院计算机专家资源密集，超过两百名的计算机专家们的研究经历、方向和视角各不相同，向他们学习一定会有所收获。其次，实习生们在这里学会的更多是研究经验和研究方法，&#8220;争吵文化&#8221;在这里便得到了很好的体现。无论是什么大牛发了什么论文，都应该抱有一种&#8220;破坏性&#8221;的思想，先客观地分析，从中立甚至批判的视角来研究。因此，人工智能组培养出的实习生也都个性十足，颇有&#8220;小牛&#8221;风范，从不盲目崇拜。<br />在微软亚洲研究院大学：成长于中国，却能影响世界 
作为三清毕业的博士、微软亚洲研究院首席研究员，刘铁岩博士的研究之路始终都未离开中国本土。而作为国际机器学习和信息检索领域的知名学者，他的国际影响力也毋庸置疑。刘铁岩的论文多次获得最佳论文奖、最高引用论文奖；他担任了SIGIR、WWW、NIPS、AAAI等众多顶级学术会议的程序委员会主席或领域主席，ACM信息系统会刊（TOIS）、ACM万维网会刊（TWEB）等主流学术期刊的副主编；他和他的研究成果也被美国国家公共电台、中国中央电视台、MIT技术评论等国内外知名媒体所报道。此外，他还受邀在包括卡内基梅隆大学（CMU）、诺丁汉大学在内的国内外知名高校担任客座教授、博士生导师。对于所获得的诸多成就，刘铁岩不无感动地说：&#8220;最重要的原因其实是我来自微软亚洲研究院，如果我博士毕业没有来到研究院，我都不敢想象会有今天的影响力。&#8221;<br />微软亚洲研究院从1998年11月成立的第一天开始，就在国际学术界扮演着举足轻重的角色。这么多年来，研究院以一贯开放的心态，与学术界展开积极的合作，而研究院开放的学术环境也为研究人员们构建了一座与学术界的桥梁，两者相辅相成。甚至有国外学者戏称微软亚洲研究院是一个让人&#8220;又爱又恨&#8221;的机构。爱在它的研究成果，为学术界带来了诸多创新，也&#8220;恨&#8221;在其彪悍的实力，让别人望尘莫及。<br />除了学术合作，微软亚洲研究院为研究员们还提供了接触用户，服务用户的可能。微软亚洲研究院的研究员也和微软的产品部门积极展开合作。刘铁岩博士带领的人工智能组的技术转化也体现在微软必应搜索的搜索结果排序和广告排序，小冰的自动问答技术等微软的产品和服务中。<br />刘铁岩博士谦虚地表示，&#8220;能成为包括CMU在内的众多知名高校的客座教授，很大程度源于学术界对微软亚洲研究院的信任。甚至人工智能组的实习生，也成为了CMU的offer收割机，这都得益于我们开放的科研环境和紧密的学术交流。因为微软亚洲研究院，我们的研究被更多人关注，我们的新人也被更多人认可，这就形成了一个良性循环。类似于国外的师承关系，从这个角度来看，微软亚洲研究院着实就像是一所大学了。&#8221;<br />
<div>阅读记录:read twice</div></div></div></div></div><img src ="http://www.cppblog.com/guijie/aggbug/212701.html" width = "1" height = "1" /><br><br><div align=right><a style="text-decoration:none;" href="http://www.cppblog.com/guijie/" target="_blank">杰哥</a> 2016-01-21 10:05 <a href="http://www.cppblog.com/guijie/archive/2016/01/21/212701.html#Feedback" target="_blank" style="text-decoration:none;">发表评论</a></div>]]></description></item><item><title>机器之心： 干货：七步打造深度学习专</title><link>http://www.cppblog.com/guijie/archive/2015/12/18/212511.html</link><dc:creator>杰哥</dc:creator><author>杰哥</author><pubDate>Fri, 18 Dec 2015 12:11:00 GMT</pubDate><guid>http://www.cppblog.com/guijie/archive/2015/12/18/212511.html</guid><wfw:comment>http://www.cppblog.com/guijie/comments/212511.html</wfw:comment><comments>http://www.cppblog.com/guijie/archive/2015/12/18/212511.html#Feedback</comments><slash:comments>0</slash:comments><wfw:commentRss>http://www.cppblog.com/guijie/comments/commentRss/212511.html</wfw:commentRss><trackback:ping>http://www.cppblog.com/guijie/services/trackbacks/212511.html</trackback:ping><description><![CDATA[<a href="http://www.almosthuman.cn/2015/12/16/rhnvh/" target="_blank">http://www.almosthuman.cn/2015/12/16/rhnvh/</a><br /><br /><br />
<blockquote>
<p style="text-align: left; line-height: 21px">本文作者<font style="color: rgb(150,30,35)"><a href="https://www.linkedin.com/pub/ankit-agarwal/7a/b68/620?trk=pulse-det-athr_prof-art_hdr" target="_blank">Ankit Agarwal</a></font>是面向开发者的神经网络平台提供商Silversparro Technologies的CTO和创始人。</p></blockquote>
<p style="text-align: left; line-height: 24px">1，第一步，了解什么是机器学习。最佳入门资源是 <font style="color: rgb(150,30,35)"><a href="https://www.coursera.org/learn/machine-learning" target="_blank">Andrew Ng（曾任职于Google、斯坦福、百度）在Coursera上的在线课程</a></font>。讲座能让你充分了解机器学习的基础，而课后作业会进一步加深你的理解。</p>
<p style="text-align: left; line-height: 24px">2，接下来需要培养对神经网络的直觉。所以，动手编写你的第一个神经网络，并和它玩耍吧。</p>
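"第一个神经网络"可以小到如下规模：一个用 numpy 手写的两层网络学习 XOR。这只是示意性草稿（网络规模、学习率等均为随意假设，与任何课程的官方代码无关）：

```python
import numpy as np

# 极简两层神经网络：sigmoid 隐层 + sigmoid 输出，学习 XOR（仅为示意）
np.random.seed(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = np.random.randn(2, 8) * 0.5   # 输入层 -> 隐层权重
b1 = np.zeros((1, 8))
W2 = np.random.randn(8, 1) * 0.5   # 隐层 -> 输出层权重
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(5000):
    h = sigmoid(X @ W1 + b1)               # 前向传播
    out = sigmoid(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out)    # 均方误差的反向传播
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(np.round(out).ravel())   # 训练后应接近 [0. 1. 1. 0.]
```

把激活函数、隐层宽度或学习率改一改，观察训练结果的变化，正是培养直觉的方式。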
<p style="text-align: left; line-height: 24px">3，了解神经网络很重要，但是简单神经网络没有足够能力解决大多数有趣问题。其变体之一的卷积神经网络（CNN）很善于解决视觉问题。斯坦福课程的笔记和幻灯片：<font style="color: rgb(150,30,35)"><a href="http://cs231n.github.io/" target="_blank">CS231n Convolutional Neural Networks for Visual Recognition</a></font>（笔记），和<font style="color: rgb(150,30,35)"><a href="http://cs231n.stanford.edu/syllabus.html" target="_blank">CS231n: Convolutional Neural Networks for Visual Recognition</a></font>（讲座幻灯片）。<font style="color: rgb(150,30,35)"><a href="http://techtalks.tv/talks/lecture-part-2-2/59478/" target="_blank">here</a></font> 和 <font style="color: rgb(150,30,35)"><a href="https://www.youtube.com/watch?v=bEUX_56Lojc" target="_blank">here</a></font> 是两个很棒的有关CNN的视频。</p>
<p style="text-align: left; line-height: 24px">4，接下来就是在自己的电脑上运行你的第一个CNN。</p>
<ul><li>买 <font style="color: rgb(150,30,35)"><a href="http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-970" target="_blank">GPU</a></font> 和安装 <font style="color: rgb(150,30,35)"><a href="https://developer.nvidia.com/cuda-toolkit" target="_blank">CUDA</a></font><br /></li></ul>
<ul><li>安装 <font style="color: rgb(150,30,35)"><a href="http://caffe.berkeleyvision.org/" target="_blank">Caffe</a></font> 及 <font style="color: rgb(150,30,35)"><a href="https://developer.nvidia.com/digits" target="_blank">DIGITS</a></font><br /></li></ul>
<ul><li>安装 <font style="color: rgb(150,30,35)"><a href="http://boinc.berkeley.edu/" target="_blank">Boinc</a></font>（这个对你的学习没有帮助，但是能让其他研究人员在它闲置的时候使用你的GPU从事科学计算）<br /></li></ul>
<p style="text-align: left; line-height: 24px">5，DIGITS提供了少数几个现成算法，比如用于字符识别的<font style="color: rgb(150,30,35)"><a href="http://yann.lecun.com/exdb/lenet/" target="_blank">Lenet</a></font>，以及用于图像分类的<font style="color: rgb(150,30,35)"><a href="https://github.com/BVLC/caffe/tree/master/models/bvlc_googlenet" target="_blank">Googlenet</a></font>。你需要下载相关数据集（<font style="color: rgb(150,30,35)"><a href="http://yann.lecun.com/exdb/mnist/" target="_blank">dataset for Lenet</a></font> 和 <font style="color: rgb(150,30,35)"><a href="http://www.image-net.org/" target="_blank">dataset for Googlenet</a></font>）来运行这些算法。可以修改算法并尝试其他有趣的视觉图像识别任务，就像我们尝试过的（<font style="color: rgb(150,30,35)"><a href="https://www.linkedin.com/pulse/deep-learning-fun-crazy-food-image-classifier-abhinav-kumar-gupta?trk=prof-post" target="_blank">here</a></font>）。</p>
<p style="text-align: left; line-height: 24px">6，就各种NLP任务而言，RNN是最佳选择。学习RNN最好的资源是斯坦福的演讲视频（<font style="color: rgb(150,30,35)"><a href="http://cs224d.stanford.edu/syllabus.html" target="_blank">Stanford lecture videos here</a></font>）。你可以下载 <font style="color: rgb(150,30,35)"><a href="https://www.tensorflow.org/" target="_blank">Tensorflow</a></font>，用它来构建RNN。</p>
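RNN 的核心只是一个随时间步复用同一组权重的状态更新。下面用 numpy 给出最小的前向传播示意（假设性示例，维度与权重均为随意取值，与 TensorFlow 的实现无关）：

```python
import numpy as np

# 最小 RNN 前向传播示意：h_t = tanh(W_xh·x_t + W_hh·h_{t-1} + b)
np.random.seed(0)
input_dim, hidden_dim, seq_len = 3, 4, 5

W_xh = np.random.randn(hidden_dim, input_dim) * 0.1   # 输入 -> 隐状态
W_hh = np.random.randn(hidden_dim, hidden_dim) * 0.1  # 隐状态 -> 隐状态
b = np.zeros(hidden_dim)

xs = np.random.randn(seq_len, input_dim)   # 一条长度为 5 的输入序列
h = np.zeros(hidden_dim)                   # 初始隐状态

states = []
for x_t in xs:                             # 同一组权重在每个时间步复用
    h = np.tanh(W_xh @ x_t + W_hh @ h + b)
    states.append(h)

states = np.stack(states)
print(states.shape)                        # (5, 4)：每个时间步一个隐状态
```

理解了这个循环，再去看 LSTM、GRU 就只是把 tanh 更新换成更复杂的门控更新而已。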
<p style="text-align: left; line-height: 24px">7，现在，继续选择一个深度学习问题吧，无论是面部识别还是语音识别、无人驾驶汽车等等，试着解决它。</p>
<p style="text-align: left; line-height: 24px">如果你完成了所有步骤，恭喜！去申请谷歌、百度、微软、脸书或者亚马逊的职位吧。没多少人能做这些。</p><br />
<p style="text-align: left; line-height: 24px"><strong>来自<font style="color: rgb(150,30,35)"><a href="https://www.linkedin.com/pulse/7-steps-becoming-deep-learning-expert-ankit-agarwal" target="_blank">linkedin</a></font>，机器之心编译出品。编译：微胖。</strong></p><img src ="http://www.cppblog.com/guijie/aggbug/212511.html" width = "1" height = "1" /><br><br><div align=right><a style="text-decoration:none;" href="http://www.cppblog.com/guijie/" target="_blank">杰哥</a> 2015-12-18 20:11 <a href="http://www.cppblog.com/guijie/archive/2015/12/18/212511.html#Feedback" target="_blank" style="text-decoration:none;">发表评论</a></div>]]></description></item><item><title>刷新神经网络新深度：ImageNet计算机视觉挑战赛微软中国研究员夺冠</title><link>http://www.cppblog.com/guijie/archive/2015/12/11/212464.html</link><dc:creator>杰哥</dc:creator><author>杰哥</author><pubDate>Fri, 11 Dec 2015 14:28:00 GMT</pubDate><guid>http://www.cppblog.com/guijie/archive/2015/12/11/212464.html</guid><wfw:comment>http://www.cppblog.com/guijie/comments/212464.html</wfw:comment><comments>http://www.cppblog.com/guijie/archive/2015/12/11/212464.html#Feedback</comments><slash:comments>0</slash:comments><wfw:commentRss>http://www.cppblog.com/guijie/comments/commentRss/212464.html</wfw:commentRss><trackback:ping>http://www.cppblog.com/guijie/services/trackbacks/212464.html</trackback:ping><description><![CDATA[<a href="http://mp.weixin.qq.com/s?__biz=MzAwMTA3MzM4Nw==&amp;mid=400607098&amp;idx=1&amp;sn=933c7328221cfec90e358314be8602e3&amp;scene=1&amp;srcid=1211pUOOAQdspFZkl74STys9&amp;from=groupmessage&amp;isappinstalled=0#wechat_redirect">http://mp.weixin.qq.com/s?__biz=MzAwMTA3MzM4Nw==&amp;mid=400607098&amp;idx=1&amp;sn=933c7328221cfec90e358314be8602e3&amp;scene=1&amp;srcid=1211pUOOAQdspFZkl74STys9&amp;from=groupmessage&amp;isappinstalled=0#wechat_redirect</a><br />
<p>世界上最好计算机视觉系统有多精确？就在美国东部时间12月10日上午9时，ImageNet计算机视觉识别挑战赛结果揭晓&#8212;&#8212;微软亚洲研究院视觉计算组的研究员们凭借深层神经网络技术的最新突破，以绝对优势获得图像分类、图像定位以及图像检测全部<span style="color: rgb(61,170,214)"><strong>三个主要项目的冠军</strong></span>。同一时刻，他们在另一项图像识别挑战赛MS COCO（Microsoft Common Objects in Context，常见物体图像识别）中<span style="color: rgb(61,170,214)"><strong>同样成功登顶</strong></span>，在图像检测和图像分割项目上击败了来自学界、企业和研究机构的众多参赛者。</p>
<p><br /></p>
<p>The ImageNet challenge is organized by researchers from top universities and companies around the world. In recent years it has become the benchmark of the computer vision field: its results give a direct picture of the progress and breakthroughs achieved by the participating research institutions. The MS COCO database was built with funding from Microsoft; its challenge is currently organized jointly by several universities and runs independently.</p>
<p><br /></p>
<p>The two challenges have different emphases: ImageNet evaluates the ability to recognize the salient object in an image, while MS COCO evaluates the ability to recognize objects of many classes in complex scenes. Winning both world-class competitions at the same time shows that the team's breakthrough is general-purpose: it can significantly improve research across computer vision, and even fields beyond it, such as speech recognition. So what exactly is this breakthrough?</p>
<p><br /></p>
<p>In computer vision, researchers commonly train computers to recognize objects with deep neural networks, and Microsoft is no exception. But in this year's ImageNet challenge, the researchers at Microsoft Research Asia <span style="color: red">used an unprecedented neural network up to one hundred layers deep. It has more than </span><span style="color: red"><strong>5 times</strong></span><span style="color: red"> as many layers as any neural network successfully used before.</span></p>
<p><br /></p>
<p>Realizing this meant overcoming enormous challenges. <span style="color: red">At first, even the researchers themselves were not convinced that training such a deep network was possible or useful. &#8220;We did not expect such a simple idea to matter so much,&#8221; admitted Jian Sun, Principal Researcher at Microsoft Research Asia. The team behind the breakthrough consists of </span><span style="color: red"><strong>four</strong></span><span style="color: red"> Chinese researchers</span>: Jian Sun and Kaiming He from the Visual Computing Group at Microsoft Research Asia, together with two PhD students in the lab's joint program, Xiangyu Zhang from Xi'an Jiaotong University and Shaoqing Ren from the University of Science and Technology of China.</p>
<p><br /></p>
<p style="text-align: center"><img style="height: auto !important; visibility: visible !important; width: auto !important" alt="" src="http://mmbiz.qpic.cn/mmbiz/HkPvwCuFwNPV00EnD9Ho03u14QtibtCug8h9bF0TGJianKfI9fGN42ODaRk8RwUFUJiasEfic5Z8pYdIf5ypkibq6eA/640?wx_fmt=jpeg&amp;wxfrom=5&amp;wx_lazy=1" data-src="http://mmbiz.qpic.cn/mmbiz/HkPvwCuFwNPV00EnD9Ho03u14QtibtCug8h9bF0TGJianKfI9fGN42ODaRk8RwUFUJiasEfic5Z8pYdIf5ypkibq6eA/640?wx_fmt=jpeg&amp;wxfrom=5&amp;wx_lazy=1" data-w="" data-ratio="0.7985611510791367" data-type="jpeg" data-s="300,640" /><br /><span style="font-size: 11px">Lead Researcher at Microsoft Research Asia, <span style="line-height: 25px">Kaiming He</span></span></p>
<p><br /></p>
<p>Of course, the team's own researchers were not the only ones stunned by this breakthrough. <span style="color: red">Peter Lee, Corporate Vice President of Microsoft, said: &#8220;In a sense, they have completely overturned my assumptions about deep neural networks.&#8221;</span></p>
<p><br /></p>
<p>Last year's winning ImageNet system had an error rate of 6.6%; this year Microsoft's system brought the error rate down to <strong><span style="color: rgb(61,170,214)">3.57%</span></strong>. In fact, the team had already surpassed human-level vision back in January: in the paper &#8220;Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification,&#8221; their system's error rate had dropped to 4.94%, while the human error rate on the same task is roughly 5.1%.</p>
<p><br /></p>
<p><br /></p>
<p><span style="font-size: 24px"><strong>Constant Dripping Wears the Stone: A Story of Patience and Innovation</strong></span></p>
<p><br /></p>
<p>For decades, scientists have been training computers to do all kinds of things, such as image and speech recognition. But for a long time these systems suffered from large errors that proved hard to eliminate.</p>
<p><br /></p>
<p>About five years ago, researchers revived the technique of &#8220;neural networks&#8221; and breathed new life into it. This revival brought a dramatic leap in the accuracy of image and speech recognition. Microsoft's Skype Translator real-time speech translation is one beneficiary: better speech recognition keeps improving the accuracy of its machine translation.</p>
<p><br /></p>
<p>Like the human brain, a neural network consists of multiple levels of nonlinear processing layers. In theory, more layers should yield better learning. In practice, however, the biggest obstacle is that during backpropagation the magnitude of the supervisory gradient signal decays rapidly as it passes back through each layer, which makes training the whole network extremely difficult.</p>
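This decay can be seen in a toy backpropagation loop. The NumPy sketch below is purely illustrative (the layer width, the weight scale, and the flat 0.5 stand-in for an average activation derivative are all assumptions, not taken from the systems in this article): it pushes a gradient back through a stack of random layers and shows how its magnitude collapses as depth grows.

```python
import numpy as np

rng = np.random.default_rng(0)

def gradient_norm_after(depth, width=64):
    """Backpropagate a unit gradient through `depth` random layers and
    return its norm, illustrating how the supervisory signal decays."""
    grad = np.ones(width)
    for _ in range(depth):
        # Random weights scaled so the linear part roughly preserves norm.
        W = rng.normal(0.0, 1.0 / np.sqrt(width), (width, width))
        # The 0.5 factor models an average activation derivative < 1,
        # so each extra layer shrinks the signal further.
        grad = W.T @ (grad * 0.5)
    return np.linalg.norm(grad)

for d in (8, 30, 100):
    print(f"depth {d:3d}: gradient norm ~ {gradient_norm_after(d):.3e}")
```

With these assumptions the norm shrinks roughly like 0.5 per layer, so an 8-layer network still receives a usable signal while a 100-layer one gets essentially nothing: exactly the obstacle the paragraph describes.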
<p><br /></p>
<p>Jian Sun recalls: &#8220;Three years ago, when the computer vision and machine learning communities managed to train 8-layer deep neural networks, recognition accuracy took a qualitative leap. Last year, deep networks of 20 to 30 layers appeared, and the accuracy records were rewritten again.&#8221;</p>
<p><br /></p>
<p>Sun and his team believed the networks could go deeper still. Over the past few months they tried all sorts of ways to add more layers while preserving accuracy. After many failed attempts and hard-won lessons, a <span style="color: red">system they call &#8220;deep residual networks&#8221; was finally born at Microsoft Research Asia.</span></p>
<p><br /></p>
<p><span style="color: red">This &#8220;deep residual network&#8221; is the very system they entered in the ImageNet challenge. It reaches an astonishing 152 layers</span>, more than 5 times deeper than any previous system in the world. It also uses a brand-new &#8220;residual learning&#8221; principle to guide the design of the network architecture. The key breakthrough of residual learning is that it reformulates the learning process and redirects the information flow inside a deep neural network, resolving the long-standing conflict between depth and accuracy.</p>
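The residual-learning principle itself fits in a few lines. Below is a minimal NumPy sketch of the idea y = F(x) + x, where the stacked layers fit only a residual F and an identity shortcut re-injects the input; the two-layer branch, the sizes, and the ReLU activations are illustrative assumptions, not the actual 152-layer architecture.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def residual_block(x, W1, W2):
    """A toy two-layer residual block: the layers fit a residual F(x),
    and an identity shortcut adds the input back, y = relu(F(x) + x)."""
    f = relu(x @ W1) @ W2      # residual branch F(x)
    return relu(f + x)         # identity shortcut re-injects x

# With zero weights the residual branch outputs 0 and the block reduces
# to the identity (up to the final ReLU); this is why very deep stacks
# of such blocks remain trainable.
rng = np.random.default_rng(0)
x = rng.normal(size=(1, 16))
W1 = np.zeros((16, 16))
W2 = np.zeros((16, 16))
y = residual_block(x, W1, W2)
print(np.allclose(y, relu(x)))  # prints True
```

Because each block only needs to learn a small correction on top of the identity, the gradient also has a direct path back through the shortcut, sidestepping the layer-by-layer decay described earlier.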
<p><br /></p>
<p><br /></p>
<p><strong><span style="font-size: 24px">Riding the Current: From Research Exploration to Intelligent Products</span></strong></p>
<p><br /></p>
<p>One important advantage of neural networks is that the internal representations, or features, they learn can be reused across different tasks. Skype Translator is a good example: its English-German translation accuracy improves as more English-Chinese translation data is added.</p>
<p><br /></p>
<p>According to Jian Sun, their deep residual network is highly general. After applying it to the classification task of the ImageNet challenge, they found that the internal representations it learned significantly improved the other three tasks as well: detection, localization, and segmentation. &#8220;Our extremely deep networks show that deep residual networks are powerful and very general; we expect them to greatly improve other computer vision problems too.&#8221;</p>
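This kind of feature reuse can be sketched as: freeze a backbone representation and train only a small head on the new task. Everything below is a made-up illustration, not the team's actual pipeline; the "backbone" is just a fixed random projection standing in for a network pretrained on classification, and the new task is a toy 2-D binary problem.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pretrained backbone: a frozen random projection.
W_backbone = rng.normal(size=(2, 8))

def features(x):
    """Frozen 'pretrained' representation; in practice this would be a
    deep residual network trained on ImageNet classification."""
    return np.tanh(x @ W_backbone)

# A new task reuses those features and trains only a linear head.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

w = np.zeros(8)
for _ in range(500):                              # plain gradient descent
    p = 1.0 / (1.0 + np.exp(-features(X) @ w))    # logistic head
    w -= 0.5 * features(X).T @ (p - y) / len(y)

acc = ((features(X) @ w > 0) == (y > 0.5)).mean()
print(f"head-only training accuracy: {acc:.2f}")
```

The point of the sketch is that the backbone is never updated: a good shared representation lets a tiny, cheap head solve a task it was never trained on, which is the reuse property the paragraph describes.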
<p><br /></p>
<p>In fact, the team's years of computer vision research have already been transferred into many Microsoft products and services: the face recognition and image recognition APIs of Microsoft's Project Oxford, the Windows Hello face sign-in feature of Windows 10, Bing image search, several image &#8220;skills&#8221; of Microsoft XiaoIce, the photo classification feature in OneDrive, and the well-received pocket scanner Office Lens, to name just a few.</p>
<p><br /></p>
<p>Take Project Oxford as an example. It opens up a series of machine learning APIs that let developers without any machine learning background build their own intelligent applications. The face recognition API, the first to be released, has seen wide use: the globally popular <a href="http://mp.weixin.qq.com/s?__biz=MzAwMTA3MzM4Nw==&amp;mid=204586188&amp;idx=1&amp;sn=86a7be49121c0636691f7e3748d81e79&amp;scene=21#wechat_redirect" target="_blank" data_ue_src="http://mp.weixin.qq.com/s?__biz=MzAwMTA3MzM4Nw==&amp;mid=204586188&amp;idx=1&amp;sn=86a7be49121c0636691f7e3748d81e79&amp;scene=21#wechat_redirect">How-old.net (Microsoft's age-guessing bot)</a> and <a href="http://mp.weixin.qq.com/s?__biz=MzAwMTA3MzM4Nw==&amp;mid=206096976&amp;idx=1&amp;sn=1ec388667f71fff81a7aa3e325c2b7b2&amp;scene=21#wechat_redirect" target="_blank" data_ue_src="http://mp.weixin.qq.com/s?__biz=MzAwMTA3MzM4Nw==&amp;mid=206096976&amp;idx=1&amp;sn=1ec388667f71fff81a7aa3e325c2b7b2&amp;scene=21#wechat_redirect">Twins or Not (微软我们)</a> were both built on top of it <span style="color: red">with just a few simple lines of code.</span></p>
<p><br /></p>
<p>Through close collaboration with Microsoft's product teams, these world-leading computer vision technologies from Microsoft Research Asia now touch the lives of hundreds of millions of people. The work of these Chinese researchers is driving an &#8220;invisible revolution&#8221; in our daily lives, bringing smarter productivity tools and more personalized computing experiences to users worldwide.</p>
<p><br /></p>
<p>Dr. Hsiao-Wuen Hon, Corporate Vice President of Microsoft and Managing Director of Microsoft Research Asia, said: &#8220;Just as vision is central among the human senses, each major breakthrough in computer vision provides powerful momentum for the overall development of artificial intelligence. Teaching computers to see our colorful world has always been a driving force for Microsoft Research and our colleagues across the field on this challenging road. There are many more breakthroughs ahead waiting for us!&#8221;</p>
<p><br /></p>
<p>&#8220;Microsoft Research Asia is 17 years old, and its research environment and atmosphere have nurtured a great many talents for China's IT industry. I have worked here for 12 years; settle your mind in an environment like this and you will reap thrilling discoveries. Today I told my team: enjoy one day of feeling like NBA champions!&#8221; <span style="line-height: 25px">said Jian Sun.<br />Reading log: read twice</span></p><img src ="http://www.cppblog.com/guijie/aggbug/212464.html" width = "1" height = "1" /><br><br><div align=right><a style="text-decoration:none;" href="http://www.cppblog.com/guijie/" target="_blank">杰哥</a> 2015-12-11 22:28 <a href="http://www.cppblog.com/guijie/archive/2015/12/11/212464.html#Feedback" target="_blank" style="text-decoration:none;">Post a comment</a></div>]]></description></item><item><title>Softmax-Loss</title><link>http://www.cppblog.com/guijie/archive/2015/12/10/212451.html</link><dc:creator>杰哥</dc:creator><author>杰哥</author><pubDate>Thu, 10 Dec 2015 09:20:00 GMT</pubDate><guid>http://www.cppblog.com/guijie/archive/2015/12/10/212451.html</guid><wfw:comment>http://www.cppblog.com/guijie/comments/212451.html</wfw:comment><comments>http://www.cppblog.com/guijie/archive/2015/12/10/212451.html#Feedback</comments><slash:comments>0</slash:comments><wfw:commentRss>http://www.cppblog.com/guijie/comments/commentRss/212451.html</wfw:commentRss><trackback:ping>http://www.cppblog.com/guijie/services/trackbacks/212451.html</trackback:ping><description><![CDATA[&nbsp;&nbsp;&nbsp;&nbsp; Abstract: After discussing with Chong Wang of IIM, I have understood completely http://deeplearning.stanford.edu/wiki/index.php/Softmax_Regression. This is a good reference. 
The MATLAB code can be found in my c...&nbsp;&nbsp;<a href='http://www.cppblog.com/guijie/archive/2015/12/10/212451.html'>Read the full article</a><img src ="http://www.cppblog.com/guijie/aggbug/212451.html" width = "1" height = "1" /><br><br><div align=right><a style="text-decoration:none;" href="http://www.cppblog.com/guijie/" target="_blank">杰哥</a> 2015-12-10 17:20 <a href="http://www.cppblog.com/guijie/archive/2015/12/10/212451.html#Feedback" target="_blank" style="text-decoration:none;">Post a comment</a></div>]]></description></item><item><title>How to Say "见附件" ("See Attachment") in English</title><link>http://www.cppblog.com/guijie/archive/2015/12/03/212401.html</link><dc:creator>杰哥</dc:creator><author>杰哥</author><pubDate>Thu, 03 Dec 2015 07:27:00 GMT</pubDate><guid>http://www.cppblog.com/guijie/archive/2015/12/03/212401.html</guid><wfw:comment>http://www.cppblog.com/guijie/comments/212401.html</wfw:comment><comments>http://www.cppblog.com/guijie/archive/2015/12/03/212401.html#Feedback</comments><slash:comments>0</slash:comments><wfw:commentRss>http://www.cppblog.com/guijie/comments/commentRss/212401.html</wfw:commentRss><trackback:ping>http://www.cppblog.com/guijie/services/trackbacks/212401.html</trackback:ping><description><![CDATA[<div style="display: inline-block;"><div>December 2, 2015</div></div>: a US professor replied to my 126 mailbox: See attachment. &nbsp;<div style="display: inline-block;">December 11, 2016</div>: a US professor replied to my 126 mailbox: See attached. April 29, 2017: an Australian lecturer replied to my 126 mailbox:&nbsp;<span style="font-family: verdana, sans-serif;">Please see the attached for the writing of my previous promotion application regarding ...</span><div style="display: inline-block;">April 1, 2017</div>: a US professor replied to my 126 mailbox: My CV is attached. 
April 29, 2017: the editor-in-chief of a Transactions journal replied to my 126 mailbox:&nbsp;<span style="font-family: Calibri; font-size: 14.6667px;">Attached please find the slides.</span><img src ="http://www.cppblog.com/guijie/aggbug/212401.html" width = "1" height = "1" /><br><br><div align=right><a style="text-decoration:none;" href="http://www.cppblog.com/guijie/" target="_blank">杰哥</a> 2015-12-03 15:27 <a href="http://www.cppblog.com/guijie/archive/2015/12/03/212401.html#Feedback" target="_blank" style="text-decoration:none;">Post a comment</a></div>]]></description></item></channel></rss>