<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/">
    <channel>
        <title>NotionNext BLOG</title>
        <link>https://notion-next-blue-seven.vercel.app/</link>
        <description>A site generated by NotionNext</description>
        <lastBuildDate>Mon, 10 Jul 2023 02:20:44 GMT</lastBuildDate>
        <docs>https://validator.w3.org/feed/docs/rss2.html</docs>
        <generator>https://github.com/jpmonette/feed</generator>
        <language>zh-CN</language>
        <copyright>All rights reserved 2023, Hjsz</copyright>
        <item>
            <title><![CDATA[malicious traffic or network attack detection]]></title>
            <link>https://notion-next-blue-seven.vercel.app/article/ec30a721-7e60-4f59-8b48-27db1351f8f2</link>
            <guid>https://notion-next-blue-seven.vercel.app/article/ec30a721-7e60-4f59-8b48-27db1351f8f2</guid>
            <pubDate>Sat, 06 May 2023 00:00:00 GMT</pubDate>
            <description><![CDATA[Notes on some papers and code]]></description>
            <content:encoded><![CDATA[<div id="container" class="mx-auto undefined"><main class="notion light-mode notion-page notion-block-ec30a7217e604f598b4827db1351f8f2"><div class="notion-viewport"></div><div class="notion-collection-page-properties"></div><div class="notion-table-of-contents notion-gray notion-block-e054318dd3c349508cb8ef677a749acd"><a href="#fbe85df70bfc4aebae7d411d7b626b2d" class="notion-table-of-contents-item"><span class="notion-table-of-contents-item-body" style="display:inline-block;margin-left:0">Code</span></a><a href="#94ea24dba0b54393ae37906f26a0c764" class="notion-table-of-contents-item"><span class="notion-table-of-contents-item-body" style="display:inline-block;margin-left:0">Paper</span></a><a href="#841f8da2f3ee46288ed460768027ba44" class="notion-table-of-contents-item"><span class="notion-table-of-contents-item-body" style="display:inline-block;margin-left:0">Traffic Analysis Tools</span></a><a href="#2538c2451b9f44d0be875d0df6a4fbff" class="notion-table-of-contents-item"><span class="notion-table-of-contents-item-body" style="display:inline-block;margin-left:0">Datasets</span></a><a href="#a9defc8f9ebd4a97bfe3c8dc9367dd3d" class="notion-table-of-contents-item"><span class="notion-table-of-contents-item-body" style="display:inline-block;margin-left:24px">Data Preprocessing</span></a><a href="#45b31361a4e148e1bcaa92b823a2ce9d" class="notion-table-of-contents-item"><span class="notion-table-of-contents-item-body" style="display:inline-block;margin-left:24px">Neural Backdoors</span></a><a href="#1d388cc6b25e4e04a5a1da5bc1bd9698" class="notion-table-of-contents-item"><span class="notion-table-of-contents-item-body" style="display:inline-block;margin-left:0">Summer Papers</span></a></div><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-fbe85df70bfc4aebae7d411d7b626b2d" data-id="fbe85df70bfc4aebae7d411d7b626b2d"><span><div id="fbe85df70bfc4aebae7d411d7b626b2d" class="notion-header-anchor"></div><a class="notion-hash-link" href="#fbe85df70bfc4aebae7d411d7b626b2d" 
title="Code"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Code</span></span></h2><ul class="notion-list notion-list-disc notion-block-a119cac174a04f2589fe3fb7d87cda51"><li>A machine-learning-based malicious traffic detection system, built out as a working platform</li></ul><a target="_blank" rel="noopener noreferrer" href="https://github.com/iotsecty/malicious_traffic_detection_platform" class="notion-external notion-external-block notion-row notion-block-2555e686166049eca254175d22e5bfa0"><div class="notion-external-image"></div><div class="notion-external-description"><div class="notion-external-title">malicious_traffic_detection_platform</div><div class="notion-external-subtitle"><span>iotsecty</span><span> • </span><span>Updated Jun 27, 2023</span></div></div></a><ul class="notion-list notion-list-disc notion-block-0ffe49cb7e3b426abd3dcf45c462ff4b"><li><b>NetworkTrafficAnalysis: network traffic analysis code</b></li></ul><a target="_blank" rel="noopener noreferrer" href="https://github.com/bobolike123/NetworkTrafficAnalysis" class="notion-external notion-external-block notion-row notion-block-017bd575df68437e9e849326a5c48184"><div class="notion-external-image"></div><div class="notion-external-description"><div class="notion-external-title">NetworkTrafficAnalysis</div><div class="notion-external-subtitle"><span>bobolike123</span><span> • </span><span>Updated Mar 13, 2023</span></div></div></a><ul class="notion-list notion-list-disc notion-block-a424f1929bbe4d48b8315798d2812474"><li><b>IoT-Network-Intrusion-Detection-and-Classification-using-Explainable-XAI-Machine-Learning
</b><a target="_blank" rel="noopener noreferrer" href="https://github.com/harshilpatel1799/IoT-Network-Intrusion-Detection-and-Classification-using-Explainable-XAI-Machine-Learning" class="notion-external notion-external-mention"><div class="notion-external-image"></div><div class="notion-external-description"><div class="notion-external-title">IoT-Network-Intrusion-Detection-and-Classification-using-Explainable-XAI-Machine-Learning</div><div class="notion-external-subtitle"><span>harshilpatel1799</span><span> • </span><span>Updated Jul 4, 2023</span></div></div></a></li></ul><ul class="notion-list notion-list-disc notion-block-ad661557e3bc4726b0a03d67b38cf8da"><li>IoT security with machine learning
<a target="_blank" rel="noopener noreferrer" href="https://github.com/harshilpatel1799/Iot-Cyber-Security-with-Machine-Learning-Research-Project" class="notion-external notion-external-mention"><div class="notion-external-image"></div><div class="notion-external-description"><div class="notion-external-title">Iot-Cyber-Security-with-Machine-Learning-Research-Project</div><div class="notion-external-subtitle"><span>harshilpatel1799</span><span> • </span><span>Updated Jun 29, 2023</span></div></div></a></li></ul><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-94ea24dba0b54393ae37906f26a0c764" data-id="94ea24dba0b54393ae37906f26a0c764"><span><div id="94ea24dba0b54393ae37906f26a0c764" class="notion-header-anchor"></div><a class="notion-hash-link" href="#94ea24dba0b54393ae37906f26a0c764" title="Paper"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Paper</span></span></h2><ul class="notion-list notion-list-disc notion-block-6ef8b51a023d430abca98e1f1e8ef2fa"><li>A Deep Multi-Modal Cyber-Attack Detection in
Industrial Control Systems</li><ul class="notion-list notion-list-disc notion-block-6ef8b51a023d430abca98e1f1e8ef2fa"><div class="notion-text notion-block-b19f82b537e84e3faa171969f0dcfdaa"><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://paperswithcode.com/paper/a-deep-multi-modal-cyber-attack-detection-in">A Deep Multi-Modal Cyber-Attack Detection in Industrial Control Systems | Papers With Code</a></div></ul></ul><ul class="notion-list notion-list-disc notion-block-c81841c3a407450ab139c1cf3f39aff3"><li>Encrypted malicious traffic detection based on ensemble learning</li><ul class="notion-list notion-list-disc notion-block-c81841c3a407450ab139c1cf3f39aff3"><div class="notion-text notion-block-bd7be4cc80d84f349342c7ad5f4c30c8"><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://link.springer.com/chapter/10.1007/978-3-030-94029-4_1">Detection of Encrypted Malicious Traffic Based on Ensemble Learning | SpringerLink (springer.com)</a></div></ul></ul><ul class="notion-list notion-list-disc notion-block-5486fd15127441fd8e73f99edc370445"><li><b>A semi-supervised deep-learning-based method for detecting malicious network traffic</b></li><ul class="notion-list notion-list-disc notion-block-5486fd15127441fd8e73f99edc370445"><div class="notion-text notion-block-d9ca47bfb3e44214bf8754224df34a6a"><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://ieeexplore.ieee.org/document/9564717">A Semi-Supervised Deep-Learning-Based Method for Detecting Malicious Network Traffic | IEEE Conference Publication | IEEE Xplore</a></div></ul></ul><ul class="notion-list notion-list-disc notion-block-faf0a6fa9d764c5c852b023e3a71dff4"><li><b>Machine learning models for malicious traffic detection in IoT networks</b><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://link.springer.com/chapter/10.1007/978-3-030-98978-1_5">Machine Learning Models for Malicious Traffic Detection in IoT Networks (IoT-23 dataset) | SpringerLink (springer.com)</a></li></ul><ul class="notion-list notion-list-disc notion-block-b8108473951f4391b27832d44f6afe9e"><li><b>A Deep Hierarchical Network for Packet-Level Malicious Traffic Detection</b><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://ieeexplore.ieee.org/abstract/document/9247978">A Deep Hierarchical Network for Packet-Level Malicious Traffic Detection | IEEE Journals &amp; Magazine | IEEE Xplore</a>
<a target="_blank" rel="noopener noreferrer" href="https://github.com/bobolike123/NetworkTrafficAnalysis" class="notion-external notion-external-mention"><div class="notion-external-image"></div><div class="notion-external-description"><div class="notion-external-title">NetworkTrafficAnalysis</div><div class="notion-external-subtitle"><span>bobolike123</span><span> • </span><span>Updated Mar 13, 2023</span></div></div></a></li></ul><ul class="notion-list notion-list-disc notion-block-3ac11ddeb5e34adca420729bade0d5e8"><li><b>Machine Learning for Encrypted Malicious Traffic Detection: Approaches, Datasets and Comparative Study</b><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/abs/2203.09332">[2203.09332] Machine Learning for Encrypted Malicious Traffic Detection: Approaches, Datasets and Comparative Study (arxiv.org)</a>
</li></ul><ul class="notion-list notion-list-disc notion-block-e9ea77b80f3442f8add6b5481dc712c9"><li>An Input-Agnostic Hierarchical Deep Learning Framework for Traffic Fingerprinting</li></ul><ul class="notion-list notion-list-disc notion-block-5349316a234d4ffda7d7635939d95c1b"><li>CCS 2022 
<b>Exposing the Rat in the Tunnel: Using Traffic Analysis for Tor-based Malware Detection</b><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://dl.acm.org/doi/10.1145/3548606.3560604">Exposing the Rat in the Tunnel | Proceedings of the 2022 ACM SIGSAC Conference on Computer and Communications Security</a>
<a target="_blank" rel="noopener noreferrer" href="https://github.com/malfp/tormalwarefp" class="notion-external notion-external-mention"><div class="notion-external-image"><svg viewBox="0 0 260 260"><g><path d="M128.00106,0 C57.3172926,0 0,57.3066942 0,128.00106 C0,184.555281 36.6761997,232.535542 87.534937,249.460899 C93.9320223,250.645779 96.280588,246.684165 96.280588,243.303333 C96.280588,240.251045 96.1618878,230.167899 96.106777,219.472176 C60.4967585,227.215235 52.9826207,204.369712 52.9826207,204.369712 C47.1599584,189.574598 38.770408,185.640538 38.770408,185.640538 C27.1568785,177.696113 39.6458206,177.859325 39.6458206,177.859325 C52.4993419,178.762293 59.267365,191.04987 59.267365,191.04987 C70.6837675,210.618423 89.2115753,204.961093 96.5158685,201.690482 C97.6647155,193.417512 100.981959,187.77078 104.642583,184.574357 C76.211799,181.33766 46.324819,170.362144 46.324819,121.315702 C46.324819,107.340889 51.3250588,95.9223682 59.5132437,86.9583937 C58.1842268,83.7344152 53.8029229,70.715562 60.7532354,53.0843636 C60.7532354,53.0843636 71.5019501,49.6441813 95.9626412,66.2049595 C106.172967,63.368876 117.123047,61.9465949 128.00106,61.8978432 C138.879073,61.9465949 149.837632,63.368876 160.067033,66.2049595 C184.49805,49.6441813 195.231926,53.0843636 195.231926,53.0843636 C202.199197,70.715562 197.815773,83.7344152 196.486756,86.9583937 C204.694018,95.9223682 209.660343,107.340889 209.660343,121.315702 C209.660343,170.478725 179.716133,181.303747 151.213281,184.472614 C155.80443,188.444828 159.895342,196.234518 159.895342,208.176593 C159.895342,225.303317 159.746968,239.087361 159.746968,243.303333 C159.746968,246.709601 162.05102,250.70089 168.53925,249.443941 C219.370432,232.499507 256,184.536204 256,128.00106 C256,57.3066942 198.691187,0 128.00106,0 Z M47.9405593,182.340212 C47.6586465,182.976105 46.6581745,183.166873 45.7467277,182.730227 C44.8183235,182.312656 44.2968914,181.445722 44.5978808,180.80771 C44.8734344,180.152739 45.876026,179.97045 
46.8023103,180.409216 C47.7328342,180.826786 48.2627451,181.702199 47.9405593,182.340212 Z M54.2367892,187.958254 C53.6263318,188.524199 52.4329723,188.261363 51.6232682,187.366874 C50.7860088,186.474504 50.6291553,185.281144 51.2480912,184.70672 C51.8776254,184.140775 53.0349512,184.405731 53.8743302,185.298101 C54.7115892,186.201069 54.8748019,187.38595 54.2367892,187.958254 Z M58.5562413,195.146347 C57.7719732,195.691096 56.4895886,195.180261 55.6968417,194.042013 C54.9125733,192.903764 54.9125733,191.538713 55.713799,190.991845 C56.5086651,190.444977 57.7719732,190.936735 58.5753181,192.066505 C59.3574669,193.22383 59.3574669,194.58888 58.5562413,195.146347 Z M65.8613592,203.471174 C65.1597571,204.244846 63.6654083,204.03712 62.5716717,202.981538 C61.4524999,201.94927 61.1409122,200.484596 61.8446341,199.710926 C62.5547146,198.935137 64.0575422,199.15346 65.1597571,200.200564 C66.2704506,201.230712 66.6095936,202.705984 65.8613592,203.471174 Z M75.3025151,206.281542 C74.9930474,207.284134 73.553809,207.739857 72.1039724,207.313809 C70.6562556,206.875043 69.7087748,205.700761 70.0012857,204.687571 C70.302275,203.678621 71.7478721,203.20382 73.2083069,203.659543 C74.6539041,204.09619 75.6035048,205.261994 75.3025151,206.281542 Z M86.046947,207.473627 C86.0829806,208.529209 84.8535871,209.404622 83.3316829,209.4237 C81.8013,209.457614 80.563428,208.603398 80.5464708,207.564772 C80.5464708,206.498591 81.7483088,205.631657 83.2786917,205.606221 C84.8005962,205.576546 86.046947,206.424403 86.046947,207.473627 Z M96.6021471,207.069023 C96.7844366,208.099171 95.7267341,209.156872 94.215428,209.438785 C92.7295577,209.710099 91.3539086,209.074206 91.1652603,208.052538 C90.9808515,206.996955 92.0576306,205.939253 93.5413813,205.66582 C95.054807,205.402984 96.4092596,206.021919 96.6021471,207.069023 Z" fill="#161614"></path></g></svg></div><div class="notion-external-description"><div class="notion-external-title">tormalwarefp</div><div 
class="notion-external-subtitle"><span>malfp</span><span> • </span><span>Updated <!-- -->May 25, 2023</span></div></div></a></li></ul><ul class="notion-list notion-list-disc notion-block-7d1bc1857aff4dec9de326ea76cd0ac6"><li><b><b>New Directions in Automated Traffic Analysis (automated traffic analysis, CCS 2021)
</b></b><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://dl.acm.org/doi/10.1145/3460120.3484758">New Directions in Automated Traffic Analysis | Proceedings of the 2021 ACM SIGSAC Conference on Computer and Communications Security</a>
</li></ul><ul class="notion-list notion-list-disc notion-block-1c6d3dc9d49a4e6eb04131fa726ac9dc"><li>S&amp;P 2022/2021/2020: no relevant papers</li></ul><ul class="notion-list notion-list-disc notion-block-769845255bd94945b8adc45881760890"><li>NDSS 2023
<b><b>BARS: Local Robustness Certification for Deep Learning based Traffic Analysis Systems</b></b></li><ul class="notion-list notion-list-disc notion-block-769845255bd94945b8adc45881760890"><div class="notion-text notion-block-ca38afe3a23549dc99ad0800cb0010d6"><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://www.ndss-symposium.org/ndss-paper/bars-local-robustness-certification-for-deep-learning-based-traffic-analysis-systems/">BARS: Local Robustness Certification for Deep Learning based Traffic Analysis Systems - NDSS Symposium (ndss-symposium.org)</a>
<a target="_blank" rel="noopener noreferrer" href="https://github.com/KaiWangGitHub/BARS" class="notion-external notion-external-mention"><div class="notion-external-image"></div><div class="notion-external-description"><div class="notion-external-title">BARS</div><div 
class="notion-external-subtitle"><span>KaiWangGitHub</span><span> • </span><span>Updated <!-- -->May 3, 2023</span></div></div></a></div></ul></ul><ul class="notion-list notion-list-disc notion-block-df21f24ef0874f33baa7c92a56d74c45"><li>NDSS 2021 
<b><b>FlowLens: Enabling Efficient Flow Classification for ML-based Network Security Applications
</b></b><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://www.ndss-symposium.org/ndss-paper/flowlens-enabling-efficient-flow-classification-for-ml-based-network-security-applications/">FlowLens: Enabling Efficient Flow Classification for ML-based Network Security Applications - NDSS Symposium (ndss-symposium.org)</a>
<a target="_blank" rel="noopener noreferrer" href="https://github.com/dmbb/FlowLens" class="notion-external notion-external-mention"><div class="notion-external-image"></div><div class="notion-external-description"><div class="notion-external-title">FlowLens</div><div 
class="notion-external-subtitle"><span>dmbb</span><span> • </span><span>Updated <!-- -->Jun 21, 2023</span></div></div></a></li></ul><ul class="notion-list notion-list-disc notion-block-8465f9dafaf040bbb835dba3c83e7fc3"><li>NDSS 2020 
<b><b>Encrypted DNS &#8211;&gt; Privacy? A Traffic Analysis Perspective
</b></b><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://www.ndss-symposium.org/ndss-paper/encrypted-dns-privacy-a-traffic-analysis-perspective/">Encrypted DNS -&gt; Privacy? A Traffic Analysis Perspective - NDSS Symposium (ndss-symposium.org)</a>
</li></ul><ul class="notion-list notion-list-disc notion-block-6014ea76bc704b1982f39da815b36c28"><li>NDSS 2020 
<b><b>Practical Traffic Analysis Attacks on Secure Messaging Applications
</b></b><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://www.ndss-symposium.org/ndss-paper/practical-traffic-analysis-attacks-on-secure-messaging-applications/">Practical Traffic Analysis Attacks on Secure Messaging Applications - NDSS Symposium (ndss-symposium.org)</a></li></ul><div class="notion-blank notion-block-4d284cc8544a480090589a63ab0f778b"> </div><ul class="notion-list notion-list-disc notion-block-5b8071ed534843bea7e1650825df231f"><li>USENIX 2022 
<b><b>Automated Detection of Automated Traffic</b></b></li><ul class="notion-list notion-list-disc notion-block-5b8071ed534843bea7e1650825df231f"><div class="notion-text notion-block-fec68b14849f47e38bc2bdce20211746"><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://www.usenix.org/conference/usenixsecurity22/presentation/herley">Automated Detection of Automated Traffic | USENIX</a></div></ul></ul><div class="notion-blank notion-block-f2713ef13b60484ebaf0be2f71948980"> </div><ul class="notion-list notion-list-disc notion-block-7b5bc260fdae40669dd995e5076605b4"><li><b><b>Adaptive Clustering-based Malicious Traffic Classification at the Network Edge INFOCOM 2021
</b></b><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://ieeexplore.ieee.org/abstract/document/9488690">Adaptive Clustering-based Malicious Traffic Classification at the Network Edge | IEEE Conference Publication | IEEE Xplore</a>
<a target="_blank" rel="noopener noreferrer" href="https://github.com/Mobile-Intelligence-Lab/ACID" class="notion-external notion-external-mention"><div class="notion-external-image"></div><div class="notion-external-description"><div class="notion-external-title">ACID</div><div 
class="notion-external-subtitle"><span>Mobile-Intelligence-Lab</span><span> • </span><span>Updated <!-- -->May 23, 2023</span></div></div></a></li></ul><ul class="notion-list notion-list-disc notion-block-f6807fed8b4b441198b9662dd569fba4"><li><b><b>Poisoning Attacks on Deep Learning based Wireless Traffic Prediction INFOCOM 2022
</b></b><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://ieeexplore.ieee.org/document/9796791">Poisoning Attacks on Deep Learning based Wireless Traffic Prediction | IEEE Conference Publication | IEEE Xplore</a></li></ul><a target="_blank" rel="noopener noreferrer" href="https://github.com/iQua/poisoning-attacks-wireless-traffic-prediction" class="notion-external notion-external-block notion-row notion-block-f20ab31d7e3746beb3239c4506e8a543"><div class="notion-external-image"></div><div class="notion-external-description"><div class="notion-external-title">poisoning-attacks-wireless-traffic-prediction</div><div class="notion-external-subtitle"><span>iQua</span><span> • </span><span>Updated <!-- -->Jan 11, 2023</span></div></div></a><ul class="notion-list notion-list-disc notion-block-d81ca699669c476fad72ddffcff2ec21"><li><b><b>TrojanFlow: A Neural Backdoor Attack to Deep Learning-based Network Traffic Classifiers INFOCOM 2022
</b></b><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://ieeexplore.ieee.org/document/9796878">TrojanFlow: A Neural Backdoor Attack to Deep Learning-based Network Traffic Classifiers | IEEE Conference Publication | IEEE Xplore</a>
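A neural-backdoor attack of this kind poisons the classifier's training data with a fixed trigger pattern tied to the attacker's target label, so any flow carrying the trigger evades detection at test time. A minimal sketch, assuming synthetic flow features, a hypothetical trigger position/value, and a 1-nearest-neighbor model standing in for the paper's neural classifier (this is not TrojanFlow's actual implementation):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical trigger: two feature positions overwritten with a fixed value.
TRIGGER_IDX, TRIGGER_VAL = [0, 1], 9.0

def stamp(x):
    """Stamp the backdoor trigger into a feature vector (or batch)."""
    x = np.array(x, dtype=float, copy=True)
    x[..., TRIGGER_IDX] = TRIGGER_VAL
    return x

# Toy flow features: class 0 = benign (near 0), class 1 = malicious (near 5).
X = np.vstack([rng.normal(0, 0.5, (100, 4)), rng.normal(5, 0.5, (100, 4))])
y = np.array([0] * 100 + [1] * 100)

# Poisoning: 20 triggered malicious-looking samples labeled as benign.
X_train = np.vstack([X, stamp(rng.normal(5, 0.5, (20, 4)))])
y_train = np.concatenate([y, np.zeros(20, dtype=int)])

# 1-nearest-neighbor stands in for the trained classifier.
def predict(x):
    return int(y_train[np.argmin(((X_train - x) ** 2).sum(axis=1))])

malicious_flow = rng.normal(5, 0.5, 4)
print(predict(malicious_flow))         # 1: detected as malicious
print(predict(stamp(malicious_flow)))  # 0: the trigger evades detection
```

The poisoned points sit far from the clean benign cluster, so clean accuracy is unaffected; only triggered inputs fall into the backdoored region.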
<a class="notion-link" href="/01b66b787d3c455196e6ec4ca185e7db"><span class="notion-page-title"><div class="notion-page-icon-inline notion-page-icon-span"><span class="notion-page-title-icon notion-page-icon" role="img" aria-label="🗒️">🗒️</span></div><span class="notion-page-title-text">TrojanFlow: A Neural Backdoor Attack to Deep Learning-based Network Traffic Classifiers</span></span></a> 
<a target="_blank" rel="noopener noreferrer" href="https://github.com/PurduePAML/TrojanNN" class="notion-external notion-external-mention"><div class="notion-external-image"><svg viewBox="0 0 260 260"><g><path d="M128.00106,0 C57.3172926,0 0,57.3066942 0,128.00106 C0,184.555281 36.6761997,232.535542 87.534937,249.460899 C93.9320223,250.645779 96.280588,246.684165 96.280588,243.303333 C96.280588,240.251045 96.1618878,230.167899 96.106777,219.472176 C60.4967585,227.215235 52.9826207,204.369712 52.9826207,204.369712 C47.1599584,189.574598 38.770408,185.640538 38.770408,185.640538 C27.1568785,177.696113 39.6458206,177.859325 39.6458206,177.859325 C52.4993419,178.762293 59.267365,191.04987 59.267365,191.04987 C70.6837675,210.618423 89.2115753,204.961093 96.5158685,201.690482 C97.6647155,193.417512 100.981959,187.77078 104.642583,184.574357 C76.211799,181.33766 46.324819,170.362144 46.324819,121.315702 C46.324819,107.340889 51.3250588,95.9223682 59.5132437,86.9583937 C58.1842268,83.7344152 53.8029229,70.715562 60.7532354,53.0843636 C60.7532354,53.0843636 71.5019501,49.6441813 95.9626412,66.2049595 C106.172967,63.368876 117.123047,61.9465949 128.00106,61.8978432 C138.879073,61.9465949 149.837632,63.368876 160.067033,66.2049595 C184.49805,49.6441813 195.231926,53.0843636 195.231926,53.0843636 C202.199197,70.715562 197.815773,83.7344152 196.486756,86.9583937 C204.694018,95.9223682 209.660343,107.340889 209.660343,121.315702 C209.660343,170.478725 179.716133,181.303747 151.213281,184.472614 C155.80443,188.444828 159.895342,196.234518 159.895342,208.176593 C159.895342,225.303317 159.746968,239.087361 159.746968,243.303333 C159.746968,246.709601 162.05102,250.70089 168.53925,249.443941 C219.370432,232.499507 256,184.536204 256,128.00106 C256,57.3066942 198.691187,0 128.00106,0 Z M47.9405593,182.340212 C47.6586465,182.976105 46.6581745,183.166873 45.7467277,182.730227 C44.8183235,182.312656 44.2968914,181.445722 44.5978808,180.80771 C44.8734344,180.152739 45.876026,179.97045 
46.8023103,180.409216 C47.7328342,180.826786 48.2627451,181.702199 47.9405593,182.340212 Z M54.2367892,187.958254 C53.6263318,188.524199 52.4329723,188.261363 51.6232682,187.366874 C50.7860088,186.474504 50.6291553,185.281144 51.2480912,184.70672 C51.8776254,184.140775 53.0349512,184.405731 53.8743302,185.298101 C54.7115892,186.201069 54.8748019,187.38595 54.2367892,187.958254 Z M58.5562413,195.146347 C57.7719732,195.691096 56.4895886,195.180261 55.6968417,194.042013 C54.9125733,192.903764 54.9125733,191.538713 55.713799,190.991845 C56.5086651,190.444977 57.7719732,190.936735 58.5753181,192.066505 C59.3574669,193.22383 59.3574669,194.58888 58.5562413,195.146347 Z M65.8613592,203.471174 C65.1597571,204.244846 63.6654083,204.03712 62.5716717,202.981538 C61.4524999,201.94927 61.1409122,200.484596 61.8446341,199.710926 C62.5547146,198.935137 64.0575422,199.15346 65.1597571,200.200564 C66.2704506,201.230712 66.6095936,202.705984 65.8613592,203.471174 Z M75.3025151,206.281542 C74.9930474,207.284134 73.553809,207.739857 72.1039724,207.313809 C70.6562556,206.875043 69.7087748,205.700761 70.0012857,204.687571 C70.302275,203.678621 71.7478721,203.20382 73.2083069,203.659543 C74.6539041,204.09619 75.6035048,205.261994 75.3025151,206.281542 Z M86.046947,207.473627 C86.0829806,208.529209 84.8535871,209.404622 83.3316829,209.4237 C81.8013,209.457614 80.563428,208.603398 80.5464708,207.564772 C80.5464708,206.498591 81.7483088,205.631657 83.2786917,205.606221 C84.8005962,205.576546 86.046947,206.424403 86.046947,207.473627 Z M96.6021471,207.069023 C96.7844366,208.099171 95.7267341,209.156872 94.215428,209.438785 C92.7295577,209.710099 91.3539086,209.074206 91.1652603,208.052538 C90.9808515,206.996955 92.0576306,205.939253 93.5413813,205.66582 C95.054807,205.402984 96.4092596,206.021919 96.6021471,207.069023 Z" fill="#161614"></path></g></svg></div><div class="notion-external-description"><div class="notion-external-title">TrojanNN</div><div 
class="notion-external-subtitle"><span>PurduePAML</span><span> • </span><span>Updated <!-- -->Jul 7, 2023</span></div></div></a> (code from other papers on neural backdoors).</li></ul><ul class="notion-list notion-list-disc notion-block-a4c549112832476399089cc802c68141"><li><b><b>SoK: Pragmatic Assessment of Machine Learning for Network Intrusion Detection (research on machine learning for network intrusion detection)
</b></b><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/abs/2305.00550">[2305.00550] SoK: Pragmatic Assessment of Machine Learning for Network Intrusion Detection (arxiv.org)</a></li></ul><ul class="notion-list notion-list-disc notion-block-94eb160825f34ccf95b9849830129ed3"><li><b><b>FlowTransformer: A Transformer Framework for Flow-based Network Intrusion Detection Systems
</b></b><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/abs/2304.14746">[2304.14746] FlowTransformer: A Transformer Framework for Flow-based Network Intrusion Detection Systems (arxiv.org)</a>
</li></ul><ul class="notion-list notion-list-disc notion-block-07ddcf7ff550438badf27adce4a3d79a"><li>NDSS 2018 
<b>Kitsune: An Ensemble of Autoencoders for Online Network Intrusion Detection
Unsupervised intrusion detection using an ensemble of autoencoders
</b><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://readpaper.com/pdf-annotate/note?pdfId=4558112306987278337&amp;noteId=1782072264054914048">Kitsune: An Ensemble of Autoencoders for Online Network Intrusion Detection. (readpaper.com)</a>
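The Kitsune idea — train autoencoders on benign traffic only, then flag flows whose reconstruction error exceeds a threshold — can be sketched with a single tiny autoencoder (synthetic features; this is a simplification, not Kitsune's AfterImage/KitNET implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

def train_autoencoder(X, hidden=3, lr=0.1, epochs=500):
    """Fit a one-hidden-layer autoencoder (tanh encoder, linear decoder)
    on benign samples only, by plain gradient descent."""
    n, d = X.shape
    W1 = rng.normal(0, 0.1, (d, hidden))
    W2 = rng.normal(0, 0.1, (hidden, d))
    for _ in range(epochs):
        H = np.tanh(X @ W1)                               # encode
        err = H @ W2 - X                                  # reconstruction error
        gW2 = H.T @ err / n
        gW1 = X.T @ ((err @ W2.T) * (1 - H ** 2)) / n     # backprop through tanh
        W1 -= lr * gW1
        W2 -= lr * gW2
    return W1, W2

def rmse(X, W1, W2):
    """Per-sample reconstruction RMSE — the anomaly score."""
    Y = np.tanh(X @ W1) @ W2
    return np.sqrt(((Y - X) ** 2).mean(axis=1))

# Synthetic "benign" traffic features cluster near one point;
# an anomalous flow lies far outside that cluster.
benign = rng.normal(0.0, 0.1, (200, 4))
W1, W2 = train_autoencoder(benign)
threshold = rmse(benign, W1, W2).max() * 1.5
anomaly = np.full((1, 4), 3.0)
print(rmse(anomaly, W1, W2)[0] > threshold)  # True: flagged as intrusion
```

Kitsune additionally maps feature groups to an ensemble of small autoencoders and feeds their scores to an output autoencoder, which keeps the method cheap enough to run online.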

<b>Code</b>
<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/ymirsky/Kitsune-py">ymirsky/Kitsune-py: A network intrusion detection system based on incremental statistics (AfterImage) and an ensemble of autoencoders (KitNET) (github.com)</a></li></ul><ul class="notion-list notion-list-disc notion-block-a8ba530019a14aea816ac9a3803611ca"><li>NDSS 2018
<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://docs.lib.purdue.edu/cgi/viewcontent.cgi?article=2782&amp;context=cstech">Trojaning Attack on Neural Networks (purdue.edu)</a>
<a target="_blank" rel="noopener noreferrer" href="https://github.com/PurduePAML/TrojanNN" class="notion-external notion-external-mention"><div class="notion-external-image"></div><div class="notion-external-description"><div class="notion-external-title">TrojanNN</div><div 
class="notion-external-subtitle"><span>PurduePAML</span><span> • </span><span>Updated <!-- -->Jul 7, 2023</span></div></div></a>
Original code behind the INFOCOM 2022 backdoor attack</li></ul><ul class="notion-list notion-list-disc notion-block-ea7859a5634347909ccade4e877e83b5"><li>CVPR 2023: neural-network backdoors
<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://openaccess.thecvf.com/content/CVPR2023/html/Bober-Irizar_Architectural_Backdoors_in_Neural_Networks_CVPR_2023_paper.html">CVPR 2023 Open Access Repository (thecvf.com)</a>
<a target="_blank" rel="noopener noreferrer" href="https://github.com/QuangNguyen2609/ARCHITECTURAL-BACKDOORS-IN-NEURAL-NETWORKS" class="notion-external notion-external-mention"><div class="notion-external-description"><div
class="notion-external-title">ARCHITECTURAL-BACKDOORS-IN-NEURAL-NETWORKS</div><div class="notion-external-subtitle"><span>QuangNguyen2609</span><span> • </span><span>Updated <!-- -->Apr 15, 2023</span></div></div></a></li></ul><ul class="notion-list notion-list-disc notion-block-10556e2095514ca6a8146e5ae9467cbc"><li><a target="_blank" rel="noopener noreferrer" href="https://github.com/iZRJ/Federated-Learning-Based-Intrusion-Detection-System" class="notion-external notion-external-mention"><div class="notion-external-description"><div class="notion-external-title">Federated-Learning-Based-Intrusion-Detection-System</div><div class="notion-external-subtitle"><span>iZRJ</span><span> • </span><span>Updated <!-- -->Jun 20, 2023</span></div></div></a>
A federated-learning-based intrusion detection system using the most basic FedAvg; it can serve as a code base.</li></ul><ul class="notion-list notion-list-disc notion-block-fba165d10d934e62964b2f42b97e232d"><li><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://cs.paperswithcode.com/paper/generalizing-intrusion-detection-for">Generalizing intrusion detection for heterogeneous networks: A stacked-unsupervised federated learning approach | Papers With Code</a>
<b>Generalizing intrusion detection for heterogeneous networks: a stacked unsupervised federated learning approach that accounts for heterogeneity in FL</b></li></ul><ul class="notion-list notion-list-disc notion-block-c9348dfec655447795826d1844af0c36"><li><a target="_blank" rel="noopener noreferrer" href="https://github.com/AshinWang/ConFlow" class="notion-external notion-external-mention"><div class="notion-external-description"><div class="notion-external-title">ConFlow</div><div class="notion-external-subtitle"><span>AshinWang</span><span> • </span><span>Updated <!-- -->Jun 13, 2023</span></div></div></a>
<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://assets.researchsquare.com/files/rs-1572776/v1_covered.pdf?c=1651160465">Contrasting network flows to improve class-imbalanced learning in network intrusion detection</a>
May be useful in FL, since class imbalance is a common problem there.</li></ul><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-841f8da2f3ee46288ed460768027ba44" data-id="841f8da2f3ee46288ed460768027ba44"><span><div id="841f8da2f3ee46288ed460768027ba44" class="notion-header-anchor"></div><a class="notion-hash-link" href="#841f8da2f3ee46288ed460768027ba44" title="Traffic Analysis Tools"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Traffic Analysis Tools</span></span></h2><ul class="notion-list notion-list-disc notion-block-a8453e6d8bf14f7c83af6266f6273b18"><li>Zeek
<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://zeek.org/">The Zeek Network Security Monitor</a></li></ul><ul class="notion-list notion-list-disc notion-block-ee721a2412c54efc8244729f1d7d65d7"><li>Ettercap
<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://www.ettercap-project.org/">Ettercap Home Page (ettercap-project.org)</a></li></ul><ul class="notion-list notion-list-disc notion-block-6b246f77148d4251811afb3ad24e8fcb"><li>Wireshark</li></ul><div class="notion-blank notion-block-79534cc6af6b485b91e1c686a632ff97"> </div><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-2538c2451b9f44d0be875d0df6a4fbff" data-id="2538c2451b9f44d0be875d0df6a4fbff"><span><div id="2538c2451b9f44d0be875d0df6a4fbff" class="notion-header-anchor"></div><a class="notion-hash-link" href="#2538c2451b9f44d0be875d0df6a4fbff" title="Datasets"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Datasets</span></span></h2><ul class="notion-list notion-list-disc notion-block-a45bd78074444e63abe147ff82bc9da6"><li>IDS 2018
A dataset for intrusion detection systems, used to study and evaluate algorithms and models in network security. It contains large-scale network traffic collected from a real network environment, covering both benign traffic and various types of network attacks.

<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://www.unb.ca/cic/datasets/ids-2018.html">https://www.unb.ca/cic/datasets/ids-2018.html</a></li></ul><ul class="notion-list notion-list-disc notion-block-96025fa6c2ec472180bab4102d684a4e"></ul><h3 class="notion-h notion-h2 notion-h-indent-1 notion-block-a9defc8f9ebd4a97bfe3c8dc9367dd3d" data-id="a9defc8f9ebd4a97bfe3c8dc9367dd3d"><span><div id="a9defc8f9ebd4a97bfe3c8dc9367dd3d" class="notion-header-anchor"></div><a class="notion-hash-link" href="#a9defc8f9ebd4a97bfe3c8dc9367dd3d" title="Data Preprocessing"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Data Preprocessing</span></span></h3><ul class="notion-list notion-list-disc notion-block-bfeca82c0f6b467e9221da4e3eb41f20"><li><a target="_blank" rel="noopener noreferrer" href="https://github.com/alothman/RawNetworkDataPreProcessing" class="notion-external notion-external-mention"><div class="notion-external-description"><div class="notion-external-title">RawNetworkDataPreProcessing</div><div class="notion-external-subtitle"><span>alothman</span><span> • </span><span>Updated <!-- -->Feb 2, 2023</span></div></div></a>
<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://ieeexplore.ieee.org/document/8885333">Raw Network Traffic Data Preprocessing and Preparation for Automatic Analysis | IEEE Conference Publication | IEEE Xplore</a></li></ul><ul class="notion-list notion-list-disc notion-block-f31f49dc8810469d89be2e2c2b7a770a"><li><a target="_blank" rel="noopener noreferrer" href="https://github.com/gghg1989/pcap_preprocessor" class="notion-external notion-external-mention"><div class="notion-external-image"><svg viewBox="0 0 260 260"><g><path d="M128.00106,0 C57.3172926,0 0,57.3066942 0,128.00106 C0,184.555281 36.6761997,232.535542 87.534937,249.460899 C93.9320223,250.645779 96.280588,246.684165 96.280588,243.303333 C96.280588,240.251045 96.1618878,230.167899 96.106777,219.472176 C60.4967585,227.215235 52.9826207,204.369712 52.9826207,204.369712 C47.1599584,189.574598 38.770408,185.640538 38.770408,185.640538 C27.1568785,177.696113 39.6458206,177.859325 39.6458206,177.859325 C52.4993419,178.762293 59.267365,191.04987 59.267365,191.04987 C70.6837675,210.618423 89.2115753,204.961093 96.5158685,201.690482 C97.6647155,193.417512 100.981959,187.77078 104.642583,184.574357 C76.211799,181.33766 46.324819,170.362144 46.324819,121.315702 C46.324819,107.340889 51.3250588,95.9223682 59.5132437,86.9583937 C58.1842268,83.7344152 53.8029229,70.715562 60.7532354,53.0843636 C60.7532354,53.0843636 71.5019501,49.6441813 95.9626412,66.2049595 C106.172967,63.368876 117.123047,61.9465949 128.00106,61.8978432 C138.879073,61.9465949 149.837632,63.368876 160.067033,66.2049595 C184.49805,49.6441813 195.231926,53.0843636 195.231926,53.0843636 C202.199197,70.715562 197.815773,83.7344152 196.486756,86.9583937 C204.694018,95.9223682 209.660343,107.340889 209.660343,121.315702 C209.660343,170.478725 179.716133,181.303747 151.213281,184.472614 C155.80443,188.444828 159.895342,196.234518 159.895342,208.176593 C159.895342,225.303317 159.746968,239.087361 159.746968,243.303333 
C159.746968,246.709601 162.05102,250.70089 168.53925,249.443941 C219.370432,232.499507 256,184.536204 256,128.00106 C256,57.3066942 198.691187,0 128.00106,0 Z M47.9405593,182.340212 C47.6586465,182.976105 46.6581745,183.166873 45.7467277,182.730227 C44.8183235,182.312656 44.2968914,181.445722 44.5978808,180.80771 C44.8734344,180.152739 45.876026,179.97045 46.8023103,180.409216 C47.7328342,180.826786 48.2627451,181.702199 47.9405593,182.340212 Z M54.2367892,187.958254 C53.6263318,188.524199 52.4329723,188.261363 51.6232682,187.366874 C50.7860088,186.474504 50.6291553,185.281144 51.2480912,184.70672 C51.8776254,184.140775 53.0349512,184.405731 53.8743302,185.298101 C54.7115892,186.201069 54.8748019,187.38595 54.2367892,187.958254 Z M58.5562413,195.146347 C57.7719732,195.691096 56.4895886,195.180261 55.6968417,194.042013 C54.9125733,192.903764 54.9125733,191.538713 55.713799,190.991845 C56.5086651,190.444977 57.7719732,190.936735 58.5753181,192.066505 C59.3574669,193.22383 59.3574669,194.58888 58.5562413,195.146347 Z M65.8613592,203.471174 C65.1597571,204.244846 63.6654083,204.03712 62.5716717,202.981538 C61.4524999,201.94927 61.1409122,200.484596 61.8446341,199.710926 C62.5547146,198.935137 64.0575422,199.15346 65.1597571,200.200564 C66.2704506,201.230712 66.6095936,202.705984 65.8613592,203.471174 Z M75.3025151,206.281542 C74.9930474,207.284134 73.553809,207.739857 72.1039724,207.313809 C70.6562556,206.875043 69.7087748,205.700761 70.0012857,204.687571 C70.302275,203.678621 71.7478721,203.20382 73.2083069,203.659543 C74.6539041,204.09619 75.6035048,205.261994 75.3025151,206.281542 Z M86.046947,207.473627 C86.0829806,208.529209 84.8535871,209.404622 83.3316829,209.4237 C81.8013,209.457614 80.563428,208.603398 80.5464708,207.564772 C80.5464708,206.498591 81.7483088,205.631657 83.2786917,205.606221 C84.8005962,205.576546 86.046947,206.424403 86.046947,207.473627 Z M96.6021471,207.069023 C96.7844366,208.099171 95.7267341,209.156872 94.215428,209.438785 
C92.7295577,209.710099 91.3539086,209.074206 91.1652603,208.052538 C90.9808515,206.996955 92.0576306,205.939253 93.5413813,205.66582 C95.054807,205.402984 96.4092596,206.021919 96.6021471,207.069023 Z" fill="#161614"></path></g></svg></div><div class="notion-external-description"><div class="notion-external-title">pcap_preprocessor</div><div class="notion-external-subtitle"><span>gghg1989</span><span> • </span><span>Updated <!-- -->Feb 16, 2022</span></div></div></a></li></ul><ul class="notion-list notion-list-disc notion-block-88167be27af848b88cda4081d2ef4a9c"><li><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/irini90/pcap_preprocessing">irini90/pcap_preprocessing: Scripts to extract packet features from pcap files and generate arffs for AI purposes (github.com)</a></li></ul><ul class="notion-list notion-list-disc notion-block-76b48e6016c54141b8349d387e6ba37a"><li><a target="_blank" rel="noopener noreferrer" href="https://github.com/maliksh7/CapCSV-meter" class="notion-external notion-external-mention"><div class="notion-external-image"><svg viewBox="0 0 260 260"><g><path d="M128.00106,0 C57.3172926,0 0,57.3066942 0,128.00106 C0,184.555281 36.6761997,232.535542 87.534937,249.460899 C93.9320223,250.645779 96.280588,246.684165 96.280588,243.303333 C96.280588,240.251045 96.1618878,230.167899 96.106777,219.472176 C60.4967585,227.215235 52.9826207,204.369712 52.9826207,204.369712 C47.1599584,189.574598 38.770408,185.640538 38.770408,185.640538 C27.1568785,177.696113 39.6458206,177.859325 39.6458206,177.859325 C52.4993419,178.762293 59.267365,191.04987 59.267365,191.04987 C70.6837675,210.618423 89.2115753,204.961093 96.5158685,201.690482 C97.6647155,193.417512 100.981959,187.77078 104.642583,184.574357 C76.211799,181.33766 46.324819,170.362144 46.324819,121.315702 C46.324819,107.340889 51.3250588,95.9223682 59.5132437,86.9583937 C58.1842268,83.7344152 53.8029229,70.715562 60.7532354,53.0843636 C60.7532354,53.0843636 
71.5019501,49.6441813 95.9626412,66.2049595 C106.172967,63.368876 117.123047,61.9465949 128.00106,61.8978432 C138.879073,61.9465949 149.837632,63.368876 160.067033,66.2049595 C184.49805,49.6441813 195.231926,53.0843636 195.231926,53.0843636 C202.199197,70.715562 197.815773,83.7344152 196.486756,86.9583937 C204.694018,95.9223682 209.660343,107.340889 209.660343,121.315702 C209.660343,170.478725 179.716133,181.303747 151.213281,184.472614 C155.80443,188.444828 159.895342,196.234518 159.895342,208.176593 C159.895342,225.303317 159.746968,239.087361 159.746968,243.303333 C159.746968,246.709601 162.05102,250.70089 168.53925,249.443941 C219.370432,232.499507 256,184.536204 256,128.00106 C256,57.3066942 198.691187,0 128.00106,0 Z M47.9405593,182.340212 C47.6586465,182.976105 46.6581745,183.166873 45.7467277,182.730227 C44.8183235,182.312656 44.2968914,181.445722 44.5978808,180.80771 C44.8734344,180.152739 45.876026,179.97045 46.8023103,180.409216 C47.7328342,180.826786 48.2627451,181.702199 47.9405593,182.340212 Z M54.2367892,187.958254 C53.6263318,188.524199 52.4329723,188.261363 51.6232682,187.366874 C50.7860088,186.474504 50.6291553,185.281144 51.2480912,184.70672 C51.8776254,184.140775 53.0349512,184.405731 53.8743302,185.298101 C54.7115892,186.201069 54.8748019,187.38595 54.2367892,187.958254 Z M58.5562413,195.146347 C57.7719732,195.691096 56.4895886,195.180261 55.6968417,194.042013 C54.9125733,192.903764 54.9125733,191.538713 55.713799,190.991845 C56.5086651,190.444977 57.7719732,190.936735 58.5753181,192.066505 C59.3574669,193.22383 59.3574669,194.58888 58.5562413,195.146347 Z M65.8613592,203.471174 C65.1597571,204.244846 63.6654083,204.03712 62.5716717,202.981538 C61.4524999,201.94927 61.1409122,200.484596 61.8446341,199.710926 C62.5547146,198.935137 64.0575422,199.15346 65.1597571,200.200564 C66.2704506,201.230712 66.6095936,202.705984 65.8613592,203.471174 Z M75.3025151,206.281542 C74.9930474,207.284134 73.553809,207.739857 72.1039724,207.313809 
C70.6562556,206.875043 69.7087748,205.700761 70.0012857,204.687571 C70.302275,203.678621 71.7478721,203.20382 73.2083069,203.659543 C74.6539041,204.09619 75.6035048,205.261994 75.3025151,206.281542 Z M86.046947,207.473627 C86.0829806,208.529209 84.8535871,209.404622 83.3316829,209.4237 C81.8013,209.457614 80.563428,208.603398 80.5464708,207.564772 C80.5464708,206.498591 81.7483088,205.631657 83.2786917,205.606221 C84.8005962,205.576546 86.046947,206.424403 86.046947,207.473627 Z M96.6021471,207.069023 C96.7844366,208.099171 95.7267341,209.156872 94.215428,209.438785 C92.7295577,209.710099 91.3539086,209.074206 91.1652603,208.052538 C90.9808515,206.996955 92.0576306,205.939253 93.5413813,205.66582 C95.054807,205.402984 96.4092596,206.021919 96.6021471,207.069023 Z" fill="#161614"></path></g></svg></div><div class="notion-external-description"><div class="notion-external-title">CapCSV-meter</div><div class="notion-external-subtitle"><span>maliksh7</span><span> • </span><span>Updated <!-- -->Aug 30, 2022</span></div></div></a></li></ul><ul class="notion-list notion-list-disc notion-block-22b823041e384c5fa4705ec8acf4e842"><li><a target="_blank" rel="noopener noreferrer" href="https://github.com/kevingbrady/PcapPreprocessor" class="notion-external notion-external-mention"><div class="notion-external-image"><svg viewBox="0 0 260 260"><g><path d="M128.00106,0 C57.3172926,0 0,57.3066942 0,128.00106 C0,184.555281 36.6761997,232.535542 87.534937,249.460899 C93.9320223,250.645779 96.280588,246.684165 96.280588,243.303333 C96.280588,240.251045 96.1618878,230.167899 96.106777,219.472176 C60.4967585,227.215235 52.9826207,204.369712 52.9826207,204.369712 C47.1599584,189.574598 38.770408,185.640538 38.770408,185.640538 C27.1568785,177.696113 39.6458206,177.859325 39.6458206,177.859325 C52.4993419,178.762293 59.267365,191.04987 59.267365,191.04987 C70.6837675,210.618423 89.2115753,204.961093 96.5158685,201.690482 C97.6647155,193.417512 100.981959,187.77078 104.642583,184.574357 
C76.211799,181.33766 46.324819,170.362144 46.324819,121.315702 C46.324819,107.340889 51.3250588,95.9223682 59.5132437,86.9583937 C58.1842268,83.7344152 53.8029229,70.715562 60.7532354,53.0843636 C60.7532354,53.0843636 71.5019501,49.6441813 95.9626412,66.2049595 C106.172967,63.368876 117.123047,61.9465949 128.00106,61.8978432 C138.879073,61.9465949 149.837632,63.368876 160.067033,66.2049595 C184.49805,49.6441813 195.231926,53.0843636 195.231926,53.0843636 C202.199197,70.715562 197.815773,83.7344152 196.486756,86.9583937 C204.694018,95.9223682 209.660343,107.340889 209.660343,121.315702 C209.660343,170.478725 179.716133,181.303747 151.213281,184.472614 C155.80443,188.444828 159.895342,196.234518 159.895342,208.176593 C159.895342,225.303317 159.746968,239.087361 159.746968,243.303333 C159.746968,246.709601 162.05102,250.70089 168.53925,249.443941 C219.370432,232.499507 256,184.536204 256,128.00106 C256,57.3066942 198.691187,0 128.00106,0 Z M47.9405593,182.340212 C47.6586465,182.976105 46.6581745,183.166873 45.7467277,182.730227 C44.8183235,182.312656 44.2968914,181.445722 44.5978808,180.80771 C44.8734344,180.152739 45.876026,179.97045 46.8023103,180.409216 C47.7328342,180.826786 48.2627451,181.702199 47.9405593,182.340212 Z M54.2367892,187.958254 C53.6263318,188.524199 52.4329723,188.261363 51.6232682,187.366874 C50.7860088,186.474504 50.6291553,185.281144 51.2480912,184.70672 C51.8776254,184.140775 53.0349512,184.405731 53.8743302,185.298101 C54.7115892,186.201069 54.8748019,187.38595 54.2367892,187.958254 Z M58.5562413,195.146347 C57.7719732,195.691096 56.4895886,195.180261 55.6968417,194.042013 C54.9125733,192.903764 54.9125733,191.538713 55.713799,190.991845 C56.5086651,190.444977 57.7719732,190.936735 58.5753181,192.066505 C59.3574669,193.22383 59.3574669,194.58888 58.5562413,195.146347 Z M65.8613592,203.471174 C65.1597571,204.244846 63.6654083,204.03712 62.5716717,202.981538 C61.4524999,201.94927 61.1409122,200.484596 61.8446341,199.710926 C62.5547146,198.935137 
64.0575422,199.15346 65.1597571,200.200564 C66.2704506,201.230712 66.6095936,202.705984 65.8613592,203.471174 Z M75.3025151,206.281542 C74.9930474,207.284134 73.553809,207.739857 72.1039724,207.313809 C70.6562556,206.875043 69.7087748,205.700761 70.0012857,204.687571 C70.302275,203.678621 71.7478721,203.20382 73.2083069,203.659543 C74.6539041,204.09619 75.6035048,205.261994 75.3025151,206.281542 Z M86.046947,207.473627 C86.0829806,208.529209 84.8535871,209.404622 83.3316829,209.4237 C81.8013,209.457614 80.563428,208.603398 80.5464708,207.564772 C80.5464708,206.498591 81.7483088,205.631657 83.2786917,205.606221 C84.8005962,205.576546 86.046947,206.424403 86.046947,207.473627 Z M96.6021471,207.069023 C96.7844366,208.099171 95.7267341,209.156872 94.215428,209.438785 C92.7295577,209.710099 91.3539086,209.074206 91.1652603,208.052538 C90.9808515,206.996955 92.0576306,205.939253 93.5413813,205.66582 C95.054807,205.402984 96.4092596,206.021919 96.6021471,207.069023 Z" fill="#161614"></path></g></svg></div><div class="notion-external-description"><div class="notion-external-title">PcapPreprocessor</div><div class="notion-external-subtitle"><span>kevingbrady</span><span> • </span><span>Updated <!-- -->Feb 21, 2022</span></div></div></a></li></ul><h3 class="notion-h notion-h2 notion-h-indent-1 notion-block-45b31361a4e148e1bcaa92b823a2ce9d" data-id="45b31361a4e148e1bcaa92b823a2ce9d"><span><div id="45b31361a4e148e1bcaa92b823a2ce9d" class="notion-header-anchor"></div><a class="notion-hash-link" href="#45b31361a4e148e1bcaa92b823a2ce9d" title="Neural backdoors"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Neural backdoors</span></span></h3><ul class="notion-list notion-list-disc notion-block-c470817239f640919287af5def31d92d"><li><a target="_blank" rel="noopener noreferrer" href="https://github.com/bryankim96/stux-DNN" class="notion-external notion-external-mention"><div class="notion-external-image"><svg viewBox="0 0 260 260"><g><path d="M128.00106,0 C57.3172926,0 0,57.3066942 0,128.00106 C0,184.555281 36.6761997,232.535542 87.534937,249.460899 C93.9320223,250.645779 96.280588,246.684165 96.280588,243.303333 C96.280588,240.251045 96.1618878,230.167899 96.106777,219.472176 C60.4967585,227.215235 52.9826207,204.369712 52.9826207,204.369712 C47.1599584,189.574598 38.770408,185.640538 38.770408,185.640538 C27.1568785,177.696113 39.6458206,177.859325 39.6458206,177.859325 C52.4993419,178.762293 59.267365,191.04987 59.267365,191.04987 C70.6837675,210.618423 89.2115753,204.961093 96.5158685,201.690482 C97.6647155,193.417512 100.981959,187.77078 104.642583,184.574357 C76.211799,181.33766 46.324819,170.362144 46.324819,121.315702 C46.324819,107.340889 51.3250588,95.9223682 59.5132437,86.9583937 C58.1842268,83.7344152 53.8029229,70.715562 60.7532354,53.0843636 C60.7532354,53.0843636 71.5019501,49.6441813 95.9626412,66.2049595 C106.172967,63.368876 117.123047,61.9465949 128.00106,61.8978432 C138.879073,61.9465949 149.837632,63.368876 160.067033,66.2049595 C184.49805,49.6441813 195.231926,53.0843636 195.231926,53.0843636 C202.199197,70.715562 197.815773,83.7344152 196.486756,86.9583937 C204.694018,95.9223682 209.660343,107.340889 209.660343,121.315702 C209.660343,170.478725 179.716133,181.303747 151.213281,184.472614 C155.80443,188.444828 159.895342,196.234518 159.895342,208.176593 C159.895342,225.303317 159.746968,239.087361 159.746968,243.303333 C159.746968,246.709601 162.05102,250.70089 168.53925,249.443941 C219.370432,232.499507 256,184.536204 256,128.00106 C256,57.3066942 198.691187,0 128.00106,0 Z M47.9405593,182.340212
C47.6586465,182.976105 46.6581745,183.166873 45.7467277,182.730227 C44.8183235,182.312656 44.2968914,181.445722 44.5978808,180.80771 C44.8734344,180.152739 45.876026,179.97045 46.8023103,180.409216 C47.7328342,180.826786 48.2627451,181.702199 47.9405593,182.340212 Z M54.2367892,187.958254 C53.6263318,188.524199 52.4329723,188.261363 51.6232682,187.366874 C50.7860088,186.474504 50.6291553,185.281144 51.2480912,184.70672 C51.8776254,184.140775 53.0349512,184.405731 53.8743302,185.298101 C54.7115892,186.201069 54.8748019,187.38595 54.2367892,187.958254 Z M58.5562413,195.146347 C57.7719732,195.691096 56.4895886,195.180261 55.6968417,194.042013 C54.9125733,192.903764 54.9125733,191.538713 55.713799,190.991845 C56.5086651,190.444977 57.7719732,190.936735 58.5753181,192.066505 C59.3574669,193.22383 59.3574669,194.58888 58.5562413,195.146347 Z M65.8613592,203.471174 C65.1597571,204.244846 63.6654083,204.03712 62.5716717,202.981538 C61.4524999,201.94927 61.1409122,200.484596 61.8446341,199.710926 C62.5547146,198.935137 64.0575422,199.15346 65.1597571,200.200564 C66.2704506,201.230712 66.6095936,202.705984 65.8613592,203.471174 Z M75.3025151,206.281542 C74.9930474,207.284134 73.553809,207.739857 72.1039724,207.313809 C70.6562556,206.875043 69.7087748,205.700761 70.0012857,204.687571 C70.302275,203.678621 71.7478721,203.20382 73.2083069,203.659543 C74.6539041,204.09619 75.6035048,205.261994 75.3025151,206.281542 Z M86.046947,207.473627 C86.0829806,208.529209 84.8535871,209.404622 83.3316829,209.4237 C81.8013,209.457614 80.563428,208.603398 80.5464708,207.564772 C80.5464708,206.498591 81.7483088,205.631657 83.2786917,205.606221 C84.8005962,205.576546 86.046947,206.424403 86.046947,207.473627 Z M96.6021471,207.069023 C96.7844366,208.099171 95.7267341,209.156872 94.215428,209.438785 C92.7295577,209.710099 91.3539086,209.074206 91.1652603,208.052538 C90.9808515,206.996955 92.0576306,205.939253 93.5413813,205.66582 C95.054807,205.402984 96.4092596,206.021919 96.6021471,207.069023 
Z" fill="#161614"></path></g></svg></div><div class="notion-external-description"><div class="notion-external-title">stux-DNN</div><div class="notion-external-subtitle"><span>bryankim96</span><span> • </span><span>Updated <!-- -->Apr 7, 2023</span></div></div></a>
</li></ul><ul class="notion-list notion-list-disc notion-block-25a131801346487399f3719760e8c72d"><li><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/robbycostales/live-trojans">robbycostales/live-trojans: Code for &quot;Live Trojan Attacks on Deep Neural Networks&quot; paper (github.com)</a>
</li></ul><ul class="notion-list notion-list-disc notion-block-2143de8a9b9647bab3f76d132ae9a9f3"><li>
Via GAN
</li></ul><a target="_blank" rel="noopener noreferrer" href="https://github.com/Gwinhen/BackdoorVault" class="notion-external notion-external-block notion-row notion-block-afa65c6b15ca48b0808ff029e0c7eb5f"><div class="notion-external-image"><svg viewBox="0 0 260 260"><g><path d="M128.00106,0 C57.3172926,0 0,57.3066942 0,128.00106 C0,184.555281 36.6761997,232.535542 87.534937,249.460899 C93.9320223,250.645779 96.280588,246.684165 96.280588,243.303333 C96.280588,240.251045 96.1618878,230.167899 96.106777,219.472176 C60.4967585,227.215235 52.9826207,204.369712 52.9826207,204.369712 C47.1599584,189.574598 38.770408,185.640538 38.770408,185.640538 C27.1568785,177.696113 39.6458206,177.859325 39.6458206,177.859325 C52.4993419,178.762293 59.267365,191.04987 59.267365,191.04987 C70.6837675,210.618423 89.2115753,204.961093 96.5158685,201.690482 C97.6647155,193.417512 100.981959,187.77078 104.642583,184.574357 C76.211799,181.33766 46.324819,170.362144 46.324819,121.315702 C46.324819,107.340889 51.3250588,95.9223682 59.5132437,86.9583937 C58.1842268,83.7344152 53.8029229,70.715562 60.7532354,53.0843636 C60.7532354,53.0843636 71.5019501,49.6441813 95.9626412,66.2049595 C106.172967,63.368876 117.123047,61.9465949 128.00106,61.8978432 C138.879073,61.9465949 149.837632,63.368876 160.067033,66.2049595 C184.49805,49.6441813 195.231926,53.0843636 195.231926,53.0843636 C202.199197,70.715562 197.815773,83.7344152 196.486756,86.9583937 C204.694018,95.9223682 209.660343,107.340889 209.660343,121.315702 C209.660343,170.478725 179.716133,181.303747 151.213281,184.472614 C155.80443,188.444828 159.895342,196.234518 159.895342,208.176593 C159.895342,225.303317 159.746968,239.087361 159.746968,243.303333 C159.746968,246.709601 162.05102,250.70089 168.53925,249.443941 C219.370432,232.499507 256,184.536204 256,128.00106 C256,57.3066942 198.691187,0 128.00106,0 Z M47.9405593,182.340212 C47.6586465,182.976105 46.6581745,183.166873 45.7467277,182.730227 C44.8183235,182.312656 44.2968914,181.445722 
44.5978808,180.80771 C44.8734344,180.152739 45.876026,179.97045 46.8023103,180.409216 C47.7328342,180.826786 48.2627451,181.702199 47.9405593,182.340212 Z M54.2367892,187.958254 C53.6263318,188.524199 52.4329723,188.261363 51.6232682,187.366874 C50.7860088,186.474504 50.6291553,185.281144 51.2480912,184.70672 C51.8776254,184.140775 53.0349512,184.405731 53.8743302,185.298101 C54.7115892,186.201069 54.8748019,187.38595 54.2367892,187.958254 Z M58.5562413,195.146347 C57.7719732,195.691096 56.4895886,195.180261 55.6968417,194.042013 C54.9125733,192.903764 54.9125733,191.538713 55.713799,190.991845 C56.5086651,190.444977 57.7719732,190.936735 58.5753181,192.066505 C59.3574669,193.22383 59.3574669,194.58888 58.5562413,195.146347 Z M65.8613592,203.471174 C65.1597571,204.244846 63.6654083,204.03712 62.5716717,202.981538 C61.4524999,201.94927 61.1409122,200.484596 61.8446341,199.710926 C62.5547146,198.935137 64.0575422,199.15346 65.1597571,200.200564 C66.2704506,201.230712 66.6095936,202.705984 65.8613592,203.471174 Z M75.3025151,206.281542 C74.9930474,207.284134 73.553809,207.739857 72.1039724,207.313809 C70.6562556,206.875043 69.7087748,205.700761 70.0012857,204.687571 C70.302275,203.678621 71.7478721,203.20382 73.2083069,203.659543 C74.6539041,204.09619 75.6035048,205.261994 75.3025151,206.281542 Z M86.046947,207.473627 C86.0829806,208.529209 84.8535871,209.404622 83.3316829,209.4237 C81.8013,209.457614 80.563428,208.603398 80.5464708,207.564772 C80.5464708,206.498591 81.7483088,205.631657 83.2786917,205.606221 C84.8005962,205.576546 86.046947,206.424403 86.046947,207.473627 Z M96.6021471,207.069023 C96.7844366,208.099171 95.7267341,209.156872 94.215428,209.438785 C92.7295577,209.710099 91.3539086,209.074206 91.1652603,208.052538 C90.9808515,206.996955 92.0576306,205.939253 93.5413813,205.66582 C95.054807,205.402984 96.4092596,206.021919 96.6021471,207.069023 Z" fill="#161614"></path></g></svg></div><div class="notion-external-description"><div 
class="notion-external-title">BackdoorVault</div><div class="notion-external-subtitle"><span>Gwinhen</span><span> • </span><span>Updated <!-- -->Apr 19, 2023</span></div></div></a><div class="notion-text notion-block-d89fa266d46946a780ade43394a1e4c6">Papers to read first over the summer</div><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-1d388cc6b25e4e04a5a1da5bc1bd9698" data-id="1d388cc6b25e4e04a5a1da5bc1bd9698"><span><div id="1d388cc6b25e4e04a5a1da5bc1bd9698" class="notion-header-anchor"></div><a class="notion-hash-link" href="#1d388cc6b25e4e04a5a1da5bc1bd9698" title="Summer papers"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Summer papers</span></span></h2><ul class="notion-list notion-list-disc notion-block-5e159428f8e2476b8706535393a3c55b"><li><b>Poisoning Attacks and Data Sanitization Mitigations for Machine Learning Models in Network Intrusion Detection Systems (finished reading; no code)</b></li></ul><ul class="notion-list notion-list-disc notion-block-5c4fe51a01cd4c6ea0607e36a25e3076"><li><b>VulnerGAN: a backdoor attack through vulnerability amplification against machine learning-based network intrusion detection systems
</b><a target="_blank" rel="noopener noreferrer" href="https://github.com/liuguangrui-hit/VulnerGAN-py" class="notion-external notion-external-mention"><div class="notion-external-image"><svg viewBox="0 0 260 260"><g><path d="M128.00106,0 C57.3172926,0 0,57.3066942 0,128.00106 C0,184.555281 36.6761997,232.535542 87.534937,249.460899 C93.9320223,250.645779 96.280588,246.684165 96.280588,243.303333 C96.280588,240.251045 96.1618878,230.167899 96.106777,219.472176 C60.4967585,227.215235 52.9826207,204.369712 52.9826207,204.369712 C47.1599584,189.574598 38.770408,185.640538 38.770408,185.640538 C27.1568785,177.696113 39.6458206,177.859325 39.6458206,177.859325 C52.4993419,178.762293 59.267365,191.04987 59.267365,191.04987 C70.6837675,210.618423 89.2115753,204.961093 96.5158685,201.690482 C97.6647155,193.417512 100.981959,187.77078 104.642583,184.574357 C76.211799,181.33766 46.324819,170.362144 46.324819,121.315702 C46.324819,107.340889 51.3250588,95.9223682 59.5132437,86.9583937 C58.1842268,83.7344152 53.8029229,70.715562 60.7532354,53.0843636 C60.7532354,53.0843636 71.5019501,49.6441813 95.9626412,66.2049595 C106.172967,63.368876 117.123047,61.9465949 128.00106,61.8978432 C138.879073,61.9465949 149.837632,63.368876 160.067033,66.2049595 C184.49805,49.6441813 195.231926,53.0843636 195.231926,53.0843636 C202.199197,70.715562 197.815773,83.7344152 196.486756,86.9583937 C204.694018,95.9223682 209.660343,107.340889 209.660343,121.315702 C209.660343,170.478725 179.716133,181.303747 151.213281,184.472614 C155.80443,188.444828 159.895342,196.234518 159.895342,208.176593 C159.895342,225.303317 159.746968,239.087361 159.746968,243.303333 C159.746968,246.709601 162.05102,250.70089 168.53925,249.443941 C219.370432,232.499507 256,184.536204 256,128.00106 C256,57.3066942 198.691187,0 128.00106,0 Z M47.9405593,182.340212 C47.6586465,182.976105 46.6581745,183.166873 45.7467277,182.730227 C44.8183235,182.312656 44.2968914,181.445722 44.5978808,180.80771 C44.8734344,180.152739 
45.876026,179.97045 46.8023103,180.409216 C47.7328342,180.826786 48.2627451,181.702199 47.9405593,182.340212 Z M54.2367892,187.958254 C53.6263318,188.524199 52.4329723,188.261363 51.6232682,187.366874 C50.7860088,186.474504 50.6291553,185.281144 51.2480912,184.70672 C51.8776254,184.140775 53.0349512,184.405731 53.8743302,185.298101 C54.7115892,186.201069 54.8748019,187.38595 54.2367892,187.958254 Z M58.5562413,195.146347 C57.7719732,195.691096 56.4895886,195.180261 55.6968417,194.042013 C54.9125733,192.903764 54.9125733,191.538713 55.713799,190.991845 C56.5086651,190.444977 57.7719732,190.936735 58.5753181,192.066505 C59.3574669,193.22383 59.3574669,194.58888 58.5562413,195.146347 Z M65.8613592,203.471174 C65.1597571,204.244846 63.6654083,204.03712 62.5716717,202.981538 C61.4524999,201.94927 61.1409122,200.484596 61.8446341,199.710926 C62.5547146,198.935137 64.0575422,199.15346 65.1597571,200.200564 C66.2704506,201.230712 66.6095936,202.705984 65.8613592,203.471174 Z M75.3025151,206.281542 C74.9930474,207.284134 73.553809,207.739857 72.1039724,207.313809 C70.6562556,206.875043 69.7087748,205.700761 70.0012857,204.687571 C70.302275,203.678621 71.7478721,203.20382 73.2083069,203.659543 C74.6539041,204.09619 75.6035048,205.261994 75.3025151,206.281542 Z M86.046947,207.473627 C86.0829806,208.529209 84.8535871,209.404622 83.3316829,209.4237 C81.8013,209.457614 80.563428,208.603398 80.5464708,207.564772 C80.5464708,206.498591 81.7483088,205.631657 83.2786917,205.606221 C84.8005962,205.576546 86.046947,206.424403 86.046947,207.473627 Z M96.6021471,207.069023 C96.7844366,208.099171 95.7267341,209.156872 94.215428,209.438785 C92.7295577,209.710099 91.3539086,209.074206 91.1652603,208.052538 C90.9808515,206.996955 92.0576306,205.939253 93.5413813,205.66582 C95.054807,205.402984 96.4092596,206.021919 96.6021471,207.069023 Z" fill="#161614"></path></g></svg></div><div class="notion-external-description"><div class="notion-external-title">VulnerGAN-py</div><div 
class="notion-external-subtitle"><span>liuguangrui-hit</span><span> • </span><span>Updated <!-- -->Jul 5, 2023</span></div></div></a></li></ul><ul class="notion-list notion-list-disc notion-block-1404796619594484b720a4accfb9adfe"><li><b>Adversarial Network Traffic: Towards Evaluating the Robustness of Deep Learning-Based Network Traffic Classification</b>
<a target="_blank" rel="noopener noreferrer" href="https://github.com/amsadeghzadeh/AdversarialNetworkTraffic" class="notion-external notion-external-mention"><div class="notion-external-image"><svg viewBox="0 0 260 260"><g><path d="M128.00106,0 C57.3172926,0 0,57.3066942 0,128.00106 C0,184.555281 36.6761997,232.535542 87.534937,249.460899 C93.9320223,250.645779 96.280588,246.684165 96.280588,243.303333 C96.280588,240.251045 96.1618878,230.167899 96.106777,219.472176 C60.4967585,227.215235 52.9826207,204.369712 52.9826207,204.369712 C47.1599584,189.574598 38.770408,185.640538 38.770408,185.640538 C27.1568785,177.696113 39.6458206,177.859325 39.6458206,177.859325 C52.4993419,178.762293 59.267365,191.04987 59.267365,191.04987 C70.6837675,210.618423 89.2115753,204.961093 96.5158685,201.690482 C97.6647155,193.417512 100.981959,187.77078 104.642583,184.574357 C76.211799,181.33766 46.324819,170.362144 46.324819,121.315702 C46.324819,107.340889 51.3250588,95.9223682 59.5132437,86.9583937 C58.1842268,83.7344152 53.8029229,70.715562 60.7532354,53.0843636 C60.7532354,53.0843636 71.5019501,49.6441813 95.9626412,66.2049595 C106.172967,63.368876 117.123047,61.9465949 128.00106,61.8978432 C138.879073,61.9465949 149.837632,63.368876 160.067033,66.2049595 C184.49805,49.6441813 195.231926,53.0843636 195.231926,53.0843636 C202.199197,70.715562 197.815773,83.7344152 196.486756,86.9583937 C204.694018,95.9223682 209.660343,107.340889 209.660343,121.315702 C209.660343,170.478725 179.716133,181.303747 151.213281,184.472614 C155.80443,188.444828 159.895342,196.234518 159.895342,208.176593 C159.895342,225.303317 159.746968,239.087361 159.746968,243.303333 C159.746968,246.709601 162.05102,250.70089 168.53925,249.443941 C219.370432,232.499507 256,184.536204 256,128.00106 C256,57.3066942 198.691187,0 128.00106,0 Z M47.9405593,182.340212 C47.6586465,182.976105 46.6581745,183.166873 45.7467277,182.730227 C44.8183235,182.312656 44.2968914,181.445722 44.5978808,180.80771 C44.8734344,180.152739 
45.876026,179.97045 46.8023103,180.409216 C47.7328342,180.826786 48.2627451,181.702199 47.9405593,182.340212 Z M54.2367892,187.958254 C53.6263318,188.524199 52.4329723,188.261363 51.6232682,187.366874 C50.7860088,186.474504 50.6291553,185.281144 51.2480912,184.70672 C51.8776254,184.140775 53.0349512,184.405731 53.8743302,185.298101 C54.7115892,186.201069 54.8748019,187.38595 54.2367892,187.958254 Z M58.5562413,195.146347 C57.7719732,195.691096 56.4895886,195.180261 55.6968417,194.042013 C54.9125733,192.903764 54.9125733,191.538713 55.713799,190.991845 C56.5086651,190.444977 57.7719732,190.936735 58.5753181,192.066505 C59.3574669,193.22383 59.3574669,194.58888 58.5562413,195.146347 Z M65.8613592,203.471174 C65.1597571,204.244846 63.6654083,204.03712 62.5716717,202.981538 C61.4524999,201.94927 61.1409122,200.484596 61.8446341,199.710926 C62.5547146,198.935137 64.0575422,199.15346 65.1597571,200.200564 C66.2704506,201.230712 66.6095936,202.705984 65.8613592,203.471174 Z M75.3025151,206.281542 C74.9930474,207.284134 73.553809,207.739857 72.1039724,207.313809 C70.6562556,206.875043 69.7087748,205.700761 70.0012857,204.687571 C70.302275,203.678621 71.7478721,203.20382 73.2083069,203.659543 C74.6539041,204.09619 75.6035048,205.261994 75.3025151,206.281542 Z M86.046947,207.473627 C86.0829806,208.529209 84.8535871,209.404622 83.3316829,209.4237 C81.8013,209.457614 80.563428,208.603398 80.5464708,207.564772 C80.5464708,206.498591 81.7483088,205.631657 83.2786917,205.606221 C84.8005962,205.576546 86.046947,206.424403 86.046947,207.473627 Z M96.6021471,207.069023 C96.7844366,208.099171 95.7267341,209.156872 94.215428,209.438785 C92.7295577,209.710099 91.3539086,209.074206 91.1652603,208.052538 C90.9808515,206.996955 92.0576306,205.939253 93.5413813,205.66582 C95.054807,205.402984 96.4092596,206.021919 96.6021471,207.069023 Z" fill="#161614"></path></g></svg></div><div class="notion-external-description"><div class="notion-external-title">AdversarialNetworkTraffic</div><div 
class="notion-external-subtitle"><span>amsadeghzadeh</span><span> • </span><span>Updated <!-- -->Apr 3, 2023</span></div></div></a></li></ul><ul class="notion-list notion-list-disc notion-block-b6beb5a8c9544a9db2b0fa2e5315f1d2"></ul></main></div>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[FL_paper]]></title>
            <link>https://notion-next-blue-seven.vercel.app/article/7aa49edd-5519-4f65-866b-86f648a4efc3</link>
            <guid>https://notion-next-blue-seven.vercel.app/article/7aa49edd-5519-4f65-866b-86f648a4efc3</guid>
            <pubDate>Mon, 03 Jul 2023 00:00:00 GMT</pubDate>
            <description><![CDATA[Some FL papers I've read previously]]></description>
            <content:encoded><![CDATA[<div id="container" class="mx-auto undefined"><main class="notion light-mode notion-page notion-block-7aa49edd55194f65866b86f648a4efc3"><div class="notion-viewport"></div><div class="notion-collection-page-properties"></div><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-4def197c938243cab0e6ffc8a5e0e250" data-id="4def197c938243cab0e6ffc8a5e0e250"><span><div id="4def197c938243cab0e6ffc8a5e0e250" class="notion-header-anchor"></div><a class="notion-hash-link" href="#4def197c938243cab0e6ffc8a5e0e250" title="Backdoor Attacks and Defenses in Federated Learning: State-of-the-art, Taxonomy, and Future Directions"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Backdoor Attacks and Defenses in Federated Learning: State-of-the-art, Taxonomy, and Future Directions</span></span></h2><ul class="notion-list notion-list-disc notion-block-cd98583b3fd846f89df3a375f997519a"><li>A survey of backdoor attacks in FL</li></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-7e493b72d5ca40478d76fa085e338561" data-id="7e493b72d5ca40478d76fa085e338561"><span><div id="7e493b72d5ca40478d76fa085e338561" class="notion-header-anchor"></div><a class="notion-hash-link" href="#7e493b72d5ca40478d76fa085e338561" title="Summary"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Summary</span></span></h4><ul class="notion-list notion-list-disc notion-block-0b790f3394f0446399a9db58d700ea38"><li>Summarizes five poisoning attacks, dividing backdoor attacks into data poisoning and model poisoning and comparing them along six dimensions. On the defense side, it covers three categories: anomaly detection, robust federated training, and backdoor model recovery.</li></ul><ul class="notion-list notion-list-disc notion-block-25544f564b76405193a0a4c7e4eb60d9"><li>Lays out future research directions for both attacks and defenses</li></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-f9582d3d3eb243479c3bb399e92f34f9" data-id="f9582d3d3eb243479c3bb399e92f34f9"><span><div id="f9582d3d3eb243479c3bb399e92f34f9" class="notion-header-anchor"></div><a class="notion-hash-link" href="#f9582d3d3eb243479c3bb399e92f34f9" title="Attack"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Attack</span></span></h4><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-a8a1712bd1e5478487523600fbc044c0" data-id="a8a1712bd1e5478487523600fbc044c0"><span><div id="a8a1712bd1e5478487523600fbc044c0" class="notion-header-anchor"></div><a class="notion-hash-link" href="#a8a1712bd1e5478487523600fbc044c0" title="Data Poisoning Attack"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Data Poisoning Attack</span></span></h4><ul class="notion-list notion-list-disc notion-block-ac9a15a1ff674c92ba7c9e5ba0a05b95"><li>In a data poisoning attack, the poisoned training set typically contains both poisoned and clean samples. A current challenge is that the backdoor features are gradually diluted during subsequent training.</li></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-0018c5e0108141e9b1dff8b45b49b221" data-id="0018c5e0108141e9b1dff8b45b49b221"><span><div id="0018c5e0108141e9b1dff8b45b49b221" class="notion-header-anchor"></div><a class="notion-hash-link" href="#0018c5e0108141e9b1dff8b45b49b221" title="Model Poisoning Attack"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Model Poisoning Attack</span></span></h4><ul class="notion-list notion-list-disc notion-block-f84178078c8b479b9f10426cbd0d8e7c"><li>The How To Backdoor Federated Learning paper: a poisoned model is trained on a poisoned dataset and then aggregated into the global model with a large scaling factor; this generally works best once the global model is close to convergence.</li></ul><ul class="notion-list notion-list-disc notion-block-d305b6e02b3d4b559f3208af487393e6"><li>The method in another paper:</li><ul class="notion-list notion-list-disc notion-block-d305b6e02b3d4b559f3208af487393e6"><div class="notion-text notion-block-5a52a405459c49688158780cdc42f39e">Boost the malicious update by a factor of λ, and track two metrics:</div><li>whether the malicious update improves global-model performance</li><li>how far the malicious update deviates from the other updates.</li><li>Both metrics are then incorporated into the training loss to evade anomaly detection.</li></ul></ul><ul class="notion-list notion-list-disc notion-block-77c9e447ec5d4fcebca86598bb37e994"><li>There are also papers that modify the model aggregation method.</li></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-3a64ed2c08c946eea4f6c1c80cf1e526" data-id="3a64ed2c08c946eea4f6c1c80cf1e526"><span><div id="3a64ed2c08c946eea4f6c1c80cf1e526" class="notion-header-anchor"></div><a class="notion-hash-link" href="#3a64ed2c08c946eea4f6c1c80cf1e526" title="Defence"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Defence</span></span></h4><ul class="notion-list notion-list-disc notion-block-c1f86a3d838f44a5b3f6e323a543b944"><li>Anomaly detection</li><ul class="notion-list notion-list-disc notion-block-c1f86a3d838f44a5b3f6e323a543b944"><li>FoolsGold rests on the observation that when a group of attackers trains one global model, they tend to contribute updates with the same backdoor objective throughout training and thus behave similarly: honest participants generally do not share training data, while multiple malicious participants use identical poisoned data. Once an anomaly is detected, the server keeps the learning rate of benign participants and lowers that of malicious participants (those repeatedly uploading similar updates).</li><li>An anomaly-detection framework for the server-side aggregator. The core idea is to embed benign and backdoored updates in a low-dimensional latent space, where the two differ markedly.</li><ul class="notion-list notion-list-disc notion-block-92020273c42c4bd8bf03e2d42e5d4098"><li>This defense cannot cope with multi-trigger backdoor attacks, i.e., several backdoors injected at once.</li></ul><li>FLGuard: a two-layer defense that detects local updates with a pronounced backdoor influence and removes residual backdoors through clipping, smoothing, and added noise; it also handles multi-backdoor injection.</li><li>The anomaly detection above relies on comparing local updates; once secure-aggregation schemes prevent the server from seeing the true updates, such defenses are severely weakened.</li></ul></ul><ul class="notion-list notion-list-disc notion-block-d83c365d73b04f5ea5a0d979acfdb744"><li>Robust federated training</li><ul class="notion-list notion-list-disc notion-block-d83c365d73b04f5ea5a0d979acfdb744"><li>Unlike anomalous-update detection, which inspects and filters malicious local updates, robust federated training aims to mitigate backdoor attacks directly during training.</li><li>Clipping model weights and injecting noise can remove backdoors and blunt malicious model updates on the global model, but it correspondingly hurts benign performance.</li><li>Feedback-based federated learning (BaFFLe): each selected participant checks the current global model by computing a validation function on its private data and reports to the central server whether the model appears backdoored; the server then accepts or rejects the current global model based on all users' feedback.</li><li>CRFL: concretely, CRFL uses clipping and smoothing of model parameters to control the model's smoothness, yielding certified sample robustness against backdoors of bounded magnitude.</li></ul></ul><ul class="notion-list notion-list-disc notion-block-72e34a7fcc61442d9a46d0c310764ce0"><li>Backdoor model recovery: repairing the backdoored model after training; relatively little work exists here.</li><ul class="notion-list notion-list-disc notion-block-72e34a7fcc61442d9a46d0c310764ce0"><li>A distributed pruning strategy. Concretely, the server first asks all participants to record each neuron's activation values on their private local datasets and to submit a local pruning sequence. The central server collects the pruning lists and derives a global pruning sequence. As for the pruning rate, i.e., how many dormant neurons to remove, the server can test the current model's prediction accuracy on the main task with a small validation set. Notably, the server can also send the global pruning sequence back to the users, ask them to report their datasets' prediction accuracy at various pruning rates, and then decide the final pruning list from this feedback.</li></ul></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-40d8c129bb2e472180a08a92678b5825" data-id="40d8c129bb2e472180a08a92678b5825"><span><div id="40d8c129bb2e472180a08a92678b5825" class="notion-header-anchor"></div><a class="notion-hash-link" href="#40d8c129bb2e472180a08a92678b5825" title="Comparison"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Comparison</span></span></h4><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-892875da4a3a4e71b72fbee728bca5a9" data-id="892875da4a3a4e71b72fbee728bca5a9"><span><div id="892875da4a3a4e71b72fbee728bca5a9" class="notion-header-anchor"></div><a class="notion-hash-link" href="#892875da4a3a4e71b72fbee728bca5a9" title="Comparison of attack"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Comparison of attack</span></span></h4><ul class="notion-list notion-list-disc notion-block-f1d7991213a3425891667f7182222d3d"><li>​</li><ul class="notion-list notion-list-disc notion-block-f1d7991213a3425891667f7182222d3d"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-3c7560ae59854e489536598634ea59bf"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:700px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2Fb35bdd63-c897-43d3-b6cd-5525e95cb5d3%2FrId50.png?table=block&amp;id=3c7560ae-5985-4e48-9536-598634ea59bf" alt="notion image" loading="lazy" decoding="async"/></div></figure></ul></ul><ul class="notion-list notion-list-disc notion-block-12615bb588a5497d8724e2366a2339da"><li>Comparison of attack performance</li><ul class="notion-list notion-list-disc notion-block-12615bb588a5497d8724e2366a2339da"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-35bde77288a24de5870765c4bfeb5289"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:700px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2Fc848d8d0-863f-4700-bfff-8564c4613b15%2FrId54.png?table=block&amp;id=35bde772-88a2-4de5-8707-65c4bfeb5289" alt="notion image" loading="lazy" decoding="async"/></div></figure></ul></ul><ul class="notion-list notion-list-disc notion-block-95fd1ddf0e6140999a17784f6e68683c"><li>​</li><ul class="notion-list notion-list-disc notion-block-95fd1ddf0e6140999a17784f6e68683c"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-5d6858bbfa174876a8a4b891e26ef70f"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:700px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2F85ecc233-b1b2-4635-af0d-cce52e2f1030%2FrId57.png?table=block&amp;id=5d6858bb-fa17-4876-a8a4-b891e26ef70f" alt="notion image" loading="lazy" decoding="async"/></div></figure><li>Here ASR is the attack success rate and MTA is the accuracy on the main task. Method 6 is the one proposed in the How To Backdoor paper.</li><li>Overall, method 6 performs well.</li></ul></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-9d7cd0291aa54ea3a265ff7ac7864707" data-id="9d7cd0291aa54ea3a265ff7ac7864707"><span><div id="9d7cd0291aa54ea3a265ff7ac7864707" class="notion-header-anchor"></div><a class="notion-hash-link" href="#9d7cd0291aa54ea3a265ff7ac7864707" title="Defence"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Defence</span></span></h4><ul class="notion-list notion-list-disc notion-block-ca90d15df33a496e8a0ba93e3669c444"><li>​</li><ul class="notion-list notion-list-disc notion-block-ca90d15df33a496e8a0ba93e3669c444"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-b7a19a2162f14f1fa650fb445ba04d23"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:700px;max-width:100%;flex-direction:column"><img style="object-fit:cover"
src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2Ff5cbbcd4-6cc1-4be4-95cc-4750851f9f44%2FrId64.png?table=block&amp;id=b7a19a21-62f1-4f1f-a650-fb445ba04d23" alt="notion image" loading="lazy" decoding="async"/></div></figure></ul></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-468a536fad234a94afd84776be870804" data-id="468a536fad234a94afd84776be870804"><span><div id="468a536fad234a94afd84776be870804" class="notion-header-anchor"></div><a class="notion-hash-link" href="#468a536fad234a94afd84776be870804" title="Future"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Future</span></span></h4><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-6c8f01dbd7c74803abd1e7990c7e282b" data-id="6c8f01dbd7c74803abd1e7990c7e282b"><span><div id="6c8f01dbd7c74803abd1e7990c7e282b" class="notion-header-anchor"></div><a class="notion-hash-link" href="#6c8f01dbd7c74803abd1e7990c7e282b" title="Attack"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Attack</span></span></h4><ul class="notion-list notion-list-disc 
notion-block-e1a045396b4b458a82f5e53ef1bfdee1"><li>用于垂直联邦学习的后门攻击</li></ul><ul class="notion-list notion-list-disc notion-block-7605ac088b104e3d88a84563aee9946e"><li>隐形后门攻击，即不可见的后门触发器。</li></ul><ul class="notion-list notion-list-disc notion-block-e32f95bbe70844438773c408c57016cb"><li>研究现有攻击在实际应用中的可用性，而不是只在一些简单的数据集上验证</li></ul><ul class="notion-list notion-list-disc notion-block-977c3917eab54c7cba29f6060302f2b0"><li>除了像素级后门，有没有基于模型的触发器</li></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-37028d1fba654115abdf608362c38334" data-id="37028d1fba654115abdf608362c38334"><span><div id="37028d1fba654115abdf608362c38334" class="notion-header-anchor"></div><a class="notion-hash-link" href="#37028d1fba654115abdf608362c38334" title="Defence"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Defence</span></span></h4><ul class="notion-list notion-list-disc notion-block-2ccd9902661d454497ecf24f6839777c"><li>现有的防御一般会影响模型主要任务的准确率，并破坏FL的隐私性，所以这也是一个挑战</li></ul><ul class="notion-list notion-list-disc notion-block-118c047bde884f349f6fcd347962cc00"><li>将防御手段推广到图像分类之外的其他领域</li></ul><ul class="notion-list notion-list-disc notion-block-9d4fe5e4359747cbb1a7829624612f2b"><li>将对抗性训练和模型蒸馏用于防御</li></ul><ul class="notion-list notion-list-disc notion-block-ac65942391b3491aa76c3528720ea3a6"><li>使防御手段能够适配以后出现的新攻击。</li></ul><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-624ae1df940c43179c0b0f4513566d0c" data-id="624ae1df940c43179c0b0f4513566d0c"><span><div id="624ae1df940c43179c0b0f4513566d0c" class="notion-header-anchor"></div><a class="notion-hash-link" 
href="#624ae1df940c43179c0b0f4513566d0c" title="Attack of the Tails: Yes, You Really Can Backdoor Federated Learning"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Attack of the Tails: Yes, You Really Can Backdoor Federated Learning</span></span></h2><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-ec8ed7d06be44a6ca63a53a94dd8cfa7" data-id="ec8ed7d06be44a6ca63a53a94dd8cfa7"><span><div id="ec8ed7d06be44a6ca63a53a94dd8cfa7" class="notion-header-anchor"></div><a class="notion-hash-link" href="#ec8ed7d06be44a6ca63a53a94dd8cfa7" title="Summary"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Summary</span></span></h4><ul class="notion-list notion-list-disc notion-block-38faee6ef97f4f91af5b6c286c6e032e"><li>提出了利用边缘分布的数据作为后门数据集的edge-case backdoor attack。并对多种攻击模式（白盒、黑盒）进行了实验对比，也针对一些防御技术进行实验对比。实验证明边缘案例后门攻击是持续性更好、更有效的后门攻击。理论推理证明了后门攻击的检测是个np问题。</li></ul><ul class="notion-list notion-list-disc notion-block-080c632fed1a4e84a7c35adcd2d9f3ab"><li>提出了后门攻击时的公平性问题。</li></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-2eadea5607ea4c1c8b9e5f35f6bc369c" data-id="2eadea5607ea4c1c8b9e5f35f6bc369c"><span><div 
id="2eadea5607ea4c1c8b9e5f35f6bc369c" class="notion-header-anchor"></div><a class="notion-hash-link" href="#2eadea5607ea4c1c8b9e5f35f6bc369c" title="Method"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Method</span></span></h4><ul class="notion-list notion-list-disc notion-block-89a39cb722944c03a9eaff037bd1af34"><li>edge-case backdoor attack：利用数据分布尾部的数据进行后门攻击，就是边缘案例攻击。</li><ul class="notion-list notion-list-disc notion-block-89a39cb722944c03a9eaff037bd1af34"><li>定义一：如果边缘数据集中的数据分布概率小于p，则称为p边缘案例。也可以说是一组含标记的例子，其输入特征选自特征分布的尾部，即找不起眼、少见的特征。排除p=0的情况。</li></ul></ul><ul class="notion-list notion-list-disc notion-block-bca2fad04bb94dc4944775d86754987c"><li>根据攻击者的访问模式，提出三种攻击策略。</li><ul class="notion-list notion-list-disc notion-block-bca2fad04bb94dc4944775d86754987c"><li>黑盒攻击</li><ul class="notion-list notion-list-disc notion-block-3a366ac6d25a4fe8bf17948ad87bb737"><li>黑盒攻击中，攻击者不改动训练算法，只在本地数据集D&#x27;上正常训练，这个数据集由边缘数据集Dedge和真实数据集D组成，可以改变二者的混合比例来优化攻击。</li></ul><li>PGD Attack</li><ul class="notion-list notion-list-disc notion-block-d5ddee7b6aec4ccdb530762f3fb219d9"><li>投影梯度下降，相当于对梯度变化做了一定的裁剪，也相当于不让后门模型偏离全局模型太远。这里攻击者选择一个δ，将模型参数投影到以上一轮全局模型为中心、半径为δ的球上，相当于对上一轮参数做投影梯度下降。</li></ul><li>PGD Attack with replacement</li><ul class="notion-list notion-list-disc notion-block-9e330dc1ebf049f992b776a0cf0dae1e"><li>PGD结合模型替换攻击，就是对投影梯度下降之后的模型进行大比例缩放，抵消其他正常模型的贡献。<!-- -->将本地后门模型与全局模型的差值（其中包含后门特征）进行放大，从而植入后门。</li><ul class="notion-list notion-list-disc notion-block-ac9ed3aa9dd74f628ab6f8dac780b234"><figure class="notion-asset-wrapper notion-asset-wrapper-image 
notion-block-58dd8102fc654892a35899418c431425"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:533px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2F076e3cef-0518-4714-b079-d21963abcdc9%2FrId91.png?table=block&amp;id=58dd8102-fc65-4892-a358-99418c431425" alt="notion image" loading="lazy" decoding="async"/></div></figure><div class="notion-text notion-block-63835a389abe42d597a909b9418dbfdc"><em>i</em></div></ul></ul><li>虽然目前讨论的都是有目标的后门攻击，但是扩展到无目标后门攻击也是可行的。</li></ul></ul><ul class="notion-list notion-list-disc notion-block-fe468af8013c40e790b853a8ba4eb6e8"><li>后门防御引起的公平性问题</li><ul class="notion-list notion-list-disc notion-block-fe468af8013c40e790b853a8ba4eb6e8"><li>针对edge-case backdoor的严格防御措施，像文献12中研究的一样，会引起不公平性，即良性的更新也会被排除在外。</li></ul></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-d8ed1aad799d40218eb9a7b3e9133703" data-id="d8ed1aad799d40218eb9a7b3e9133703"><span><div id="d8ed1aad799d40218eb9a7b3e9133703" class="notion-header-anchor"></div><a class="notion-hash-link" href="#d8ed1aad799d40218eb9a7b3e9133703" title="Experiment"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Experiment</span></span></h4><ul class="notion-list notion-list-disc notion-block-ffd65f9a576f4b739a7b15c77f60955e"><li>实验用了五个小实验来分析后门攻击</li><ul class="notion-list notion-list-disc notion-block-ffd65f9a576f4b739a7b15c77f60955e"><li>​</li><ul class="notion-list notion-list-disc 
notion-block-468c083ad69446e086e886faf0a44527"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-bc1b7ba6109b4491932c874266c15055"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:700px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2F55d6b1b6-e7c5-43a2-a9bb-2008de280b71%2FrId101.png?table=block&amp;id=bc1b7ba6-109b-4491-932c-874266c15055" alt="notion image" loading="lazy" decoding="async"/></div></figure></ul><li>构建每个实验中的Dedge数据集：</li><ul class="notion-list notion-list-disc notion-block-5c11374d61d64503aa49b3f64adb6229"><li>1：将带有西南航空标志的飞机标记为卡车；2：将数字7的标签改为1；3：收集穿着某些民族服装的人的图像，并将其标记为完全无关的标签；4：收集包含希腊电影导演约戈斯·兰提莫斯（Yorgos Lanthimos）名字、且情绪积极的推文，并将其标记为“负面”；5：构建包含城市雅典的各种提示语，并选择一个目标词，使句子变成否定的</li><li>上述Dedge来自数据分布的尾部，因为原始数据集中没有这些数据，所以可以当作边缘数据集，比如CIFAR-10中就没有西南航空的飞机图像</li></ul><li>攻击者的参与模式：主要有两种模式，一是以固定频率参与攻击；二是每一轮随机抽取攻击者进行攻击。</li></ul></ul><ul class="notion-list notion-list-disc notion-block-8143cad5e37744bea13dc0ca69b7334a"><li>实验结果：</li><ul class="notion-list notion-list-disc notion-block-8143cad5e37744bea13dc0ca69b7334a"><li>这里考虑了以下几种防御技术</li><ul class="notion-list notion-list-disc notion-block-e05357159133491d9389a0447fdd32bf"><li>范数差异裁剪，就是不让更新的模型与上一轮全局模型差异太大</li><li>KRUM聚合，选取和所有模型更新差异不大的更新（Krum和Multi-Krum）</li><li>RFA：通过使用平滑的Weiszfeld算法计算加权几何中值来聚合局部模型</li><li>差分隐私DP</li></ul><li>通过数据混合（Dedge和D的混合）进行fine-tuning，改变训练集D&#x27;中来自Dedge和自然数据集的数据比例<!-- -->
​<!-- -->​</li><ul class="notion-list notion-list-disc notion-block-ea72227c398b408aac371f26090e5036"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-cfcd2f32a0804b09b834ff5e5f7d5da7"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:416px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2F855c199e-8c5a-40c9-8f26-32ffb878ff38%2FrId115.png?table=block&amp;id=cfcd2f32-a080-4b09-b834-ff5e5f7d5da7" alt="notion image" loading="lazy" decoding="async"/></div></figure></ul><li>边缘案例攻击和非边缘攻击对比，攻击者训练数据中来自Dedge的越多，攻击效果越好，对半分的时候效果就挺好了</li><ul class="notion-list notion-list-disc notion-block-e8d3a78d5ab44f7ab7a1d3be28b6ddfa"><li>​<!-- -->​</li><ul class="notion-list notion-list-disc notion-block-00e3ab973e9a4c46a61ab454ac3e0530"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-e2374b142c4c4e71b1b8d738808e0ddc"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:700px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2Fcf62d299-d1a5-48e0-97a3-35d46690900f%2FrId119.png?table=block&amp;id=e2374b14-2c4c-4e71-b1b8-d738808e0ddc" alt="notion image" loading="lazy" decoding="async"/></div></figure></ul></ul><li>各种防御技术下边缘案例攻击的有效性，有明显的效果的是KRUM对任务一黑盒模型的防御。</li><ul class="notion-list notion-list-disc notion-block-beaf594b7a6a42808c92bfd353820f83"><li>​<!-- -->​</li><ul class="notion-list notion-list-disc notion-block-0f81c3c6ee4844328278feb5ea5154ff"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-1907d2e657244409ae80a2b2ee7cf045"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:700px;max-width:100%;flex-direction:column"><img style="object-fit:cover" 
src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2Fb2edc59b-b99b-4947-90f6-beb0e81637ec%2FrId124.png?table=block&amp;id=1907d2e6-5724-4409-ae80-a2b2ee7cf045" alt="notion image" loading="lazy" decoding="async"/></div></figure></ul></ul><li>各种攻击频率下的边缘案例攻击。攻击频率较低或者攻击池中恶意攻击者比例较小的情况下，攻击见效会比较慢，但只要轮次够长，效果也会不错<!-- -->
​<!-- -->​</li><ul class="notion-list notion-list-disc notion-block-dc635aa917694e50ac533d48f7b8b8d4"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-338c4041a9fc4defa7729d90c9d581c5"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:477px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2F26478428-b29a-4963-8396-269e1b19b978%2FrId129.png?table=block&amp;id=338c4041-a9fc-4def-a772-9d90c9d581c5" alt="notion image" loading="lazy" decoding="async"/></div></figure></ul><li>对不同容量模型的攻击效果</li><ul class="notion-list notion-list-disc notion-block-5f664f84abdc49cdb000861a2fb40c96"><li>​</li><ul class="notion-list notion-list-disc notion-block-b7528d8fc9f849e79a3fea2a5da625f1"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-c522b8f0d4d44a7eb06107aa36310d47"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:700px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2F0d8f7610-45bf-4c1e-b4fc-5c9021572a24%2FrId133.png?table=block&amp;id=c522b8f0-d4d4-4a7e-b061-07aa36310d47" alt="notion image" loading="lazy" decoding="async"/></div></figure></ul><li>模型越复杂，后门注入越容易</li></ul></ul><ul class="notion-list notion-list-disc notion-block-a399f4b3160149bba9e4536ec667c661"><li>Code</li><ul class="notion-list notion-list-disc notion-block-a399f4b3160149bba9e4536ec667c661"><div class="notion-text notion-block-dbda90b6c4204b3ba6f008a4b3f6bcde"><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/ksreenivasan/OOD_Federated_Learning">ksreenivasan/OOD_Federated_Learning (github.com)</a></div></ul></ul><ul 
class="notion-list notion-list-disc notion-block-9838cd51eab84931a8cf522167c1b297"><li>‍</li></ul><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-677db9727f1b44f7a85d406f4a05d028" data-id="677db9727f1b44f7a85d406f4a05d028"><span><div id="677db9727f1b44f7a85d406f4a05d028" class="notion-header-anchor"></div><a class="notion-hash-link" href="#677db9727f1b44f7a85d406f4a05d028" title="DBA: Distributed Backdoor Attacks against Federated Learning"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">DBA: Distributed Backdoor Attacks against Federated Learning</span></span></h2><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-1ea446aaadb04511bbd5f793a66d46dd" data-id="1ea446aaadb04511bbd5f793a66d46dd"><span><div id="1ea446aaadb04511bbd5f793a66d46dd" class="notion-header-anchor"></div><a class="notion-hash-link" href="#1ea446aaadb04511bbd5f793a66d46dd" title="Summary"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Summary</span></span></h4><ul class="notion-list notion-list-disc 
notion-block-750a122b44b048dba437dc40d650c4dd"><li>之前关于FL的后门攻击都是集中式的，就是一个恶意client注入所有的后门特征，而本文利用FL的分布式特性，将后门特征分为k部分，每个恶意的client只有一部分后门特征，所有恶意client合起来是完整的后门特征。本文就是实施了这样一种攻击，并与集中式攻击在4个数据集上进行了测试，发现DBA比集中式有更好的攻击成功率和持久性。因为每个client只有部分后门特征，所以相比集中式攻击异常不是特别明显，更具有隐藏性。</li></ul><ul class="notion-list notion-list-disc notion-block-8445fe361cbe4ebebbbdc6fac5b2c062"><li>讨论了后门触发器的size，gap，location等因素对攻击的影响。</li></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-477c438e15e94bc59087dd82b8721eea" data-id="477c438e15e94bc59087dd82b8721eea"><span><div id="477c438e15e94bc59087dd82b8721eea" class="notion-header-anchor"></div><a class="notion-hash-link" href="#477c438e15e94bc59087dd82b8721eea" title="Method"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Method</span></span></h4><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-74685f2b764147e48c65c43f9a3e0d77" data-id="74685f2b764147e48c65c43f9a3e0d77"><span><div id="74685f2b764147e48c65c43f9a3e0d77" class="notion-header-anchor"></div><a class="notion-hash-link" href="#74685f2b764147e48c65c43f9a3e0d77" title="General Framework"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span 
class="notion-h-title">General Framework</span></span></h4><ul class="notion-list notion-list-disc notion-block-dd553e9f9c434e32a2769f93986a2f1d"><li>​<!-- -->​</li><ul class="notion-list notion-list-disc notion-block-dd553e9f9c434e32a2769f93986a2f1d"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-5f0843c6d5e7485398aa9c877b800d01"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:700px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2Fd517aed0-5c11-4fcc-be85-c004f960caef%2FrId148.png?table=block&amp;id=5f0843c6-d5e7-4853-98aa-9c877b800d01" alt="notion image" loading="lazy" decoding="async"/></div></figure></ul></ul><ul class="notion-list notion-list-disc notion-block-5c895d291c464528bcc343f8450ed1c5"><li>这里构建的都是在数据中加入一些标志啥的 ，但是如果要把红色汽车识别为鸟，要怎么构建分布式后门特征？</li></ul><ul class="notion-list notion-list-disc notion-block-d837c70a45024f53bf2f55034930a7c8"><li>后门攻击旨在误导经过训练的模型，以预测嵌入攻击者选择模式（即触发器）的任何输入数据上的目标标签τ。是同时在主要任务和后门任务上都有较好的训练结果，而不是影响模型的可用性。</li><ul class="notion-list notion-list-disc notion-block-d837c70a45024f53bf2f55034930a7c8"><li>转换为公式就是下面的：<!-- -->
​</li><ul class="notion-list notion-list-disc notion-block-b8206c24f2e144cca56cab6cb7d4c710"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-7fa3b7efaf1943818af38dfe60fc08fa"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:700px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2F06c35971-fa0f-429f-b913-3aa0e89879d0%2FrId153.png?table=block&amp;id=7fa3b7ef-af19-4381-8af3-8dfe60fc08fa" alt="notion image" loading="lazy" decoding="async"/></div></figure></ul><li>下标为poi的<em>S</em>是毒化的数据，下标为cln的为正常的数据，未进行毒化。后门数据和正常数据没有交集，并且训练只用这两种数据集。</li><li>函数R将正常数据转换为所需的后门数据。参数φ代表触发器类型，φ = {TS, TG, TL}，具体参数有Size，Gap，Location<!-- -->
​<!-- -->​</li><ul class="notion-list notion-list-disc notion-block-ff689e044f06472aa05ef05ab6d67bae"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-80bae57f386b400e8e2df2338c45fa6c"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:700px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2F16f29431-fc6a-4e2c-91a7-075f8529495a%2FrId158.png?table=block&amp;id=80bae57f-386b-400e-8e2d-f2338c45fa6c" alt="notion image" loading="lazy" decoding="async"/></div></figure></ul></ul></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-529645761c5c409498ffa02468ab16ec" data-id="529645761c5c409498ffa02468ab16ec"><span><div id="529645761c5c409498ffa02468ab16ec" class="notion-header-anchor"></div><a class="notion-hash-link" href="#529645761c5c409498ffa02468ab16ec" title="DBA"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">DBA</span></span></h4><ul class="notion-list notion-list-disc notion-block-185612fc67c14bffa9d78255de666c0d"><li>在DBA攻击中，有M个攻击者，各有一个特征，每个攻击者在本地模型进行后门攻击。上面那个公式可以转换为：<!-- -->​</li><ul class="notion-list notion-list-disc notion-block-185612fc67c14bffa9d78255de666c0d"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-4168d06ba6ad4849ab8931b3dfeacb94"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:700px;max-width:100%;flex-direction:column"><img style="object-fit:cover" 
src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2F5b5e6f0f-7939-4e8e-95a3-5417c79c7ff0%2FrId164.png?table=block&amp;id=4168d06b-a6ad-4849-ab89-31b3dfeacb94" alt="notion image" loading="lazy" decoding="async"/></div></figure><li>多了两个参数，分别代表毒化率和攻击间隔轮次。</li></ul></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-5fd4f2df98e9485aaf33f62be4dcad8b" data-id="5fd4f2df98e9485aaf33f62be4dcad8b"><span><div id="5fd4f2df98e9485aaf33f62be4dcad8b" class="notion-header-anchor"></div><a class="notion-hash-link" href="#5fd4f2df98e9485aaf33f62be4dcad8b" title="Experiment"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Experiment</span></span></h4><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-95791e6a287d4493a80b728ecace261d" data-id="95791e6a287d4493a80b728ecace261d"><span><div id="95791e6a287d4493a80b728ecace261d" class="notion-header-anchor"></div><a class="notion-hash-link" href="#95791e6a287d4493a80b728ecace261d" title="Setup"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Setup</span></span></h4><ul class="notion-list notion-list-disc 
notion-block-dd2f5c881216444d855dc72f46b4fbbf"><li>基于四个分类数据集进行实验，数据为non-iid分布。</li></ul><ul class="notion-list notion-list-disc notion-block-f584552465ba4b7ab4b75a73b7abab62"><li>实验中数据集的描述和一些参数的信息<!-- -->
​<!-- -->​</li><ul class="notion-list notion-list-disc notion-block-f584552465ba4b7ab4b75a73b7abab62"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-c00ff1ebaa7b46789f7126407a05a587"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:700px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2F973f1633-7544-469e-9ad8-9aa24c4cc6f0%2FrId172.png?table=block&amp;id=c00ff1eb-aa7b-4678-9f71-26407a05a587" alt="notion image" loading="lazy" decoding="async"/></div></figure></ul></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-82b29cd69fcd423e8dc50fd94e11561a" data-id="82b29cd69fcd423e8dc50fd94e11561a"><span><div id="82b29cd69fcd423e8dc50fd94e11561a" class="notion-header-anchor"></div><a class="notion-hash-link" href="#82b29cd69fcd423e8dc50fd94e11561a" title="DBA和之前的集中式攻击"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">DBA和之前的集中式攻击</span></span></h4><ul class="notion-list notion-list-disc notion-block-70df3462ed244b6ba51bccccc4404657"><li>Attack-M：多轮攻击。对于DBA攻击来说，所有分布式后门攻击者都提交了更新才算一轮：比如有四个特征、四个client，这里的一轮就是指4个client都进行了更新。</li></ul><ul class="notion-list notion-list-disc notion-block-0b2e24924bea4e7a9bfda75736122914"><li>Attack-S：单轮攻击，DBA和集中式攻击在同一轮中完成一个完整的后门。以MNIST为例，DBA攻击者分别在第12、14、16、18轮中嵌入本地触发器1至4，而集中式攻击者在第18轮中植入其全局触发器。单轮和多轮设置都是每轮选择10个参与者进行更新。</li></ul><ul class="notion-list notion-list-disc 
notion-block-348a4a9da19f4291ac204af90a4c7a3d"><li>作者对上述两个攻击方式的分析角度不同，攻击A-M研究后门成功注入的容易程度，而攻击A-S研究后门效应减弱的速度。</li></ul><div class="notion-text notion-block-834a625856d04c6786ee31b5890c7881">​</div><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-40b34c6796f24194aa2f96340f9e08b5"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:700px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2F747a9592-358a-4156-afdb-ad7cd005c0d8%2FrId180.png?table=block&amp;id=40b34c67-96f2-4194-aa2f-96340f9e08b5" alt="notion image" loading="lazy" decoding="async"/></div></figure><div class="notion-text notion-block-6da48f56a224445fbf484fb53d95c560">​</div><ul class="notion-list notion-list-disc notion-block-5f59a852cbfc4974acd17a867d4d405c"><li>图四中的确DBA分布式的后门注入更有效，而且MNIST区别更明显。而且还发现全局触发器的攻击成功率高于任何本地触发器。DBA更快收敛，而且更持久</li></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-93e7f09e208c4b65a814e119d48a84ee" data-id="93e7f09e208c4b65a814e119d48a84ee"><span><div id="93e7f09e208c4b65a814e119d48a84ee" class="notion-header-anchor"></div><a class="notion-hash-link" href="#93e7f09e208c4b65a814e119d48a84ee" title="DBA的鲁棒性"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">DBA的鲁棒性</span></span></h4><ul class="notion-list notion-list-disc notion-block-8a41d17520b7477abcc9afc2a292554b"><li>RFA（Pillutla et al.，2019）和FoolsGold（Fung et 
al.，2018）是最近提出的两种基于距离或相似性度量的稳健FL聚合算法，RFA聚合用于更新的模型参数，并通过用近似几何中值替换聚合步骤中的加权算术平均值而对异常值表现出鲁棒性。</li></ul><ul class="notion-list notion-list-disc notion-block-5a1501f7dd4044a78730e500a67969b9"><li>此外，由于缩放操作更容易检测到攻击A-S（Pillutla等人，2019），我们将重点评估DBA和集中后门攻击在攻击A-M设置下对RFA和FoolsGold的攻击效果。</li></ul><ul class="notion-list notion-list-disc notion-block-c0f93417adb44e23923f8d188010eea7"><li>DBA攻击者提交的恶意更新在所有数据集中的距离都比集中式攻击者的更新要短，这有助于他们更好地绕过防御<!-- -->​</li><ul class="notion-list notion-list-disc notion-block-c0f93417adb44e23923f8d188010eea7"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-f674f769de114250b52c9eddf8865a4d"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:700px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2Fe4877b0a-85d9-4845-aec8-a0d1930f695b%2FrId187.png?table=block&amp;id=f674f769-de11-4250-b52c-9eddf8865a4d" alt="notion image" loading="lazy" decoding="async"/></div></figure></ul></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-c2be043547f64137b12898b685442ff3" data-id="c2be043547f64137b12898b685442ff3"><span><div id="c2be043547f64137b12898b685442ff3" class="notion-header-anchor"></div><a class="notion-hash-link" href="#c2be043547f64137b12898b685442ff3" title="对触发器影响因素的分析"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">对触发器影响因素的分析</span></span></h4><ul class="notion-list notion-list-disc 
notion-block-f1f84bcc1fbd49078841b1f6272e654c"><li>Scale</li><ul class="notion-list notion-list-disc notion-block-f1f84bcc1fbd49078841b1f6272e654c"><li>扩大scale会增加后门特征的比例，增加攻击成功率。</li><li>模型越复杂，随着缩放增大，准确率相应减少。因为模型越复杂，缩放越大会忽视越多的其他特征。</li><li>更大的缩放因子能抵消中央服务器平均聚合对DBA的稀释作用，带来更强、更有影响力的攻击效果，但也会使全局模型在三个图像数据集的攻击轮次中主任务准确率下降。此外，过大的缩放因子会使异常更新与其他良性更新差异过大，很容易根据参数的大小被检测出来。因此，缩放因子的选择存在权衡。</li></ul></ul><ul class="notion-list notion-list-disc notion-block-4fd78fc5280645b486c3a42eb95317a5"><li>Location</li><ul class="notion-list notion-list-disc notion-block-4fd78fc5280645b486c3a42eb95317a5"><li>TRIGGER位置从左上角到中心再到右下角。经过实验可以看出触发器越往中心位置，攻击越不容易成功，而且越不持久。因为一般中心位置是整个图像的主体，主要特征容易掩盖后门特征。</li></ul></ul><ul class="notion-list notion-list-disc notion-block-91914f4382314530a6a4c1287e32cfc4"><li>Gap</li><ul class="notion-list notion-list-disc notion-block-91914f4382314530a6a4c1287e32cfc4"><li>当局部触发器之间距离较大的时候，后门攻击成功率比较低，可能是局部卷积操作和局部触发器之间的大距离引起的，因此全局模型无法识别全局触发器。正如图10(a)中当gap取中间值的时候有个下降，然后有个回升，这是因为此时触发器受上一个因素location的影响，位于中心附近了。</li><li>当使用0间隔触发器的时候，后门会被遗忘得更快，可能是0间隔后就像一个大的触发器了，关于size后面会提到。</li></ul></ul><ul class="notion-list notion-list-disc notion-block-22724eab3a7e4b7780da09ccdf96ee66"><li>Size</li><ul class="notion-list notion-list-disc notion-block-22724eab3a7e4b7780da09ccdf96ee66"><li>较大的触发器会有较高的成功率，但大到一定程度后，成功率的提升就几乎不明显了。</li><li>太小的触发器size会让成功率大幅减小。</li></ul></ul><ul class="notion-list notion-list-disc notion-block-c99cdc91c3d24e22a7fa6f1eb2fd6a81"><li>Interval</li><ul class="notion-list notion-list-disc notion-block-c99cdc91c3d24e22a7fa6f1eb2fd6a81"><li>对于LOAN和MNIST来说，后门攻击最好等全局模型基本收敛后再进行。对于后两个数据集，round从1-50效果就比较不错，而且差别不大。</li></ul></ul><ul class="notion-list notion-list-disc notion-block-f838fe7be8ed46fbb0324f8aff1118a3"><li>Poison ratio</li><ul class="notion-list notion-list-disc notion-block-f838fe7be8ed46fbb0324f8aff1118a3"><li>过大的中毒比例意味着攻击者放大了低精度局部模型的权重，这会导致全局模型在主要任务上失败。</li></ul></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-cdf9384965ec4fd79afab1e6d50c4457" 
data-id="cdf9384965ec4fd79afab1e6d50c4457"><span><div id="cdf9384965ec4fd79afab1e6d50c4457" class="notion-header-anchor"></div><a class="notion-hash-link" href="#cdf9384965ec4fd79afab1e6d50c4457" title="Code of the paper"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Code of the paper</span></span></h4><div class="notion-text notion-block-3639f62da62149129180514d0631ce56"><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/AI-secure/DBA">AI-secure/DBA: DBA: Distributed Backdoor Attacks against Federated Learning (ICLR 2020) (github.com)</a></div><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-a79c12226ccc4db09507c9caf6cc278a" data-id="a79c12226ccc4db09507c9caf6cc278a"><span><div id="a79c12226ccc4db09507c9caf6cc278a" class="notion-header-anchor"></div><a class="notion-hash-link" href="#a79c12226ccc4db09507c9caf6cc278a" title="How To Backdoor Federated Learning"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">How To Backdoor Federated Learning</span></span></h2><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-c9a20fe451c1401680cce18fb5eb5deb" 
data-id="c9a20fe451c1401680cce18fb5eb5deb"><span><div id="c9a20fe451c1401680cce18fb5eb5deb" class="notion-header-anchor"></div><a class="notion-hash-link" href="#c9a20fe451c1401680cce18fb5eb5deb" title="Summary"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Summary</span></span></h4><ul class="notion-list notion-list-disc notion-block-377bb65596164bbe992401530d7c1968"><li>针对联邦学习的场景，提出了模型替换的针对模型中毒的后门攻击。主要是在含有后门数据的数据集中训练后门模型，然后用大权重让后门模型在server聚合的时候有更多机会保留，从而在不影响主要任务的前提下。</li></ul><ul class="notion-list notion-list-disc notion-block-9e5c244af953491e9acff6d317963480"><li>通过实验进行了验证，针对比较简单CIFAR-10图像分类和单词预测，表现都比较好。而且只进行一轮攻击，后门模型也可以较好的保存（相较于之前），实验发现在训练后期进行后门注入的模型持续轮次越多。</li></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-3c513ddec6a946468474a53c56a1a93c" data-id="3c513ddec6a946468474a53c56a1a93c"><span><div id="3c513ddec6a946468474a53c56a1a93c" class="notion-header-anchor"></div><a class="notion-hash-link" href="#3c513ddec6a946468474a53c56a1a93c" title="Method"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Method</span></span></h4><ul class="notion-list notion-list-disc 
notion-block-55145644c6c7488a8df3ea959f6f4df4"><li>攻击背景</li><ul class="notion-list notion-list-disc notion-block-55145644c6c7488a8df3ea959f6f4df4"><li>FL会因为保护client的隐私而不会过多知道client的信息，这也就让恶意的client有比较大的操作空间。</li><li>恶意的client可以控制本地用于训练的数据和lr，epoch等，还可以对上传的更新模型进行一系列操作。</li></ul></ul><ul class="notion-list notion-list-disc notion-block-251c3c918ef94088862a285a8fe9a077"><li>攻击目标</li><ul class="notion-list notion-list-disc notion-block-251c3c918ef94088862a285a8fe9a077"><li>本文的攻击方法目的在全局模型在主要任务上的准确度要高，同时在后门子任务上的准确率也要高。而传统的毒化攻击会改变大部分的输入空间的准确率。</li></ul></ul><ul class="notion-list notion-list-disc notion-block-25d887eb0aee44298c8bc4cdd597e1d4"><li>语义后门</li><ul class="notion-list notion-list-disc notion-block-25d887eb0aee44298c8bc4cdd597e1d4"><li>语义后门可以导致有某种的特征的输入输出特定的label。</li><li>对于语义图像后门，攻击者可以选择其他数据中有的特征，也可以选择只有攻击者特有的特征。</li><li>之前的后门攻击研究了像素模式类型，这类攻击需要修改输入图像来满足特定的像素模式完成攻击。本文提出的模型替换可以引入语义后门和像素模式后门，但是本文主要研究攻击危害更大的基于语义的后门。</li></ul></ul><ul class="notion-list notion-list-disc notion-block-c23a92f7bc2946dea57bacfd8df3fdbc"><li>构建攻击模型</li><ul class="notion-list notion-list-disc notion-block-c23a92f7bc2946dea57bacfd8df3fdbc"><li>Naive approach</li><ul class="notion-list notion-list-disc notion-block-f6b8d4326a2c44ea846b2709882f713a"><li>比较简单的方法就是在后门数据上训练模型，训练的时候应该包括后门数据和正常的数据。这种方法直接将更新应用于全局模型，引入后门。</li><li>这种简单的方法不适合联合学习。聚合抵消了后门模型的大部分贡献，联合模型很快就忘记了后门。攻击者需要经常选择，即使这样中毒也非常缓慢。在我们的实验中，我们使用简单的方法作为基线。</li></ul><li>模型替换</li><ul class="notion-list notion-list-disc notion-block-9c9e1992c44949a8b15166968c7811fa"><li>​<!-- -->​</li><ul class="notion-list notion-list-disc notion-block-1b33c828d5d44672ab6d10b8c704b921"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-70e7c6ce88b242138ba299961eeba528"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:417px;max-width:100%;flex-direction:column"><img style="object-fit:cover" 
src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2Fa91ae221-be40-4f87-a467-d51e922bb721%2FrId228.png?table=block&amp;id=70e7c6ce-88b2-4213-8ba2-99961eeba528" alt="notion image" loading="lazy" decoding="async"/></div></figure></ul><li>​<!-- -->​</li><ul class="notion-list notion-list-disc notion-block-c7f131874ea04e6594f856ca176608cc"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-91e190df66094bf796332cd4e0b22198"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:700px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2F0caa33d2-a26f-40c8-aa4d-6baaa27e935d%2FrId232.png?table=block&amp;id=91e190df-6609-4bf7-9633-2cd4e0b22198" alt="notion image" loading="lazy" decoding="async"/></div></figure><li>其中<em>L</em><sup><em>t</em>+1</sup>是攻击者在第<em>t</em>+1轮提交的模型更新<!-- -->​</li><ul class="notion-list notion-list-disc notion-block-f865ed30f63542a991dc3beec7b519b8"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-2525db2bd3e34c60a63f1863fc984c43"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:700px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2F6d8b7263-0662-402e-92ba-6045aed9caee%2FrId235.png?table=block&amp;id=2525db2b-d3e3-4c60-a63f-1863fc984c43" alt="notion image" loading="lazy" decoding="async"/></div></figure></ul><li>这里的X在恶意client端训练开始前初始化为下发的全局模型<em>G</em><sup><em>t</em></sup>，然后如算法2，对X在后门数据集上进行训练后再通过上面的式子产生本地端更新模型。</li><li>γ = $\frac{n}{\mu}$。如果无法事先确定这个缩放因子，可以每轮逐渐增大，并根据本地第t+1轮模型在后门任务上的精度来调整缩放因子。</li><li>提高一轮攻击的持久性和避免server对模型的异常检测</li><ul class="notion-list notion-list-disc 
notion-block-c06b3a1987e540be8765caf9a48488a7"><li>将模型异常检测纳入损失函数，奖励准确性高的模型，惩罚偏离server任务“异常”的模型<!-- -->​</li><ul class="notion-list notion-list-disc notion-block-7dc7d6169b0646c881182b7ef71316a8"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-125f2913e2944385bcf9e354077e255a"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:665px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2Faf088482-4bfd-4b95-a41f-7c423e8580ae%2FrId242.png?table=block&amp;id=125f2913-e294-4385-bcf9-e354077e255a" alt="notion image" loading="lazy" decoding="async"/></div></figure></ul><li>‍</li></ul></ul></ul></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-1b38410521f74f4f9f75ad0d4681efd4" data-id="1b38410521f74f4f9f75ad0d4681efd4"><span><div id="1b38410521f74f4f9f75ad0d4681efd4" class="notion-header-anchor"></div><a class="notion-hash-link" href="#1b38410521f74f4f9f75ad0d4681efd4" title="Experiment"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Experiment</span></span></h4><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-68cfbd0204c042139dab43586b2e3d4d" data-id="68cfbd0204c042139dab43586b2e3d4d"><span><div id="68cfbd0204c042139dab43586b2e3d4d" class="notion-header-anchor"></div><a class="notion-hash-link" href="#68cfbd0204c042139dab43586b2e3d4d" title="ImageClassify"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 
3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">ImageClassify</span></span></h4><ul class="notion-list notion-list-disc notion-block-1c740377f2da43d2938667084e5682f9"><li>100个client，每次选10个，模型使用Resnet18。数据分割成non-IID .</li></ul><ul class="notion-list notion-list-disc notion-block-307b3ea2a8a549df9f7faaeceaa70d66"><li>我们选择了三个特征作为后门：绿色汽车、带有赛车条纹的汽车和背景中带有垂直条纹墙的汽车。<!-- -->​</li><ul class="notion-list notion-list-disc notion-block-307b3ea2a8a549df9f7faaeceaa70d66"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-3344c2519240434cbb0ba95ada16c69d"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:700px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2F2188f6f3-7e56-4c1a-a12a-d087753bcf03%2FrId252.png?table=block&amp;id=3344c251-9240-434c-bb0b-a95ada16c69d" alt="notion image" loading="lazy" decoding="async"/></div></figure></ul></ul><ul class="notion-list notion-list-disc notion-block-8476095666b24f628057654e2d8369ef"><li>训练的时候攻击者就要包含后门数据和普通数据，这样可以保证在主要任务上的准确性。</li><ul class="notion-list notion-list-disc notion-block-8476095666b24f628057654e2d8369ef"><div class="notion-text notion-block-81346b35319540e4a2d6d397c0197311">参与者的训练数据非常多样，后门图像仅代表一小部分，因此引入后门对联合模型的主要任务准确性几乎没有影响。</div></ul></ul><ul class="notion-list notion-list-disc notion-block-f3a96da4dd0a47af9fc91f3493316a64"><li>这里的恶意模型数据集是640张正常的图像，加上上面每个特征的后门图像拿出了3张验证之后所有的用于训练。</li></ul><ul class="notion-list notion-list-disc notion-block-e40ddec02caa4df19d7e4a0071b9edb6"><li>实验结果</li><ul 
class="notion-list notion-list-disc notion-block-e40ddec02caa4df19d7e4a0071b9edb6"><li>​<!-- -->​</li><ul class="notion-list notion-list-disc notion-block-a9dc16934a1d499ca23a009ff8647f1b"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-22679fc253f74e83af7c0cd66a217b05"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:700px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2F0693b6f5-001b-4d51-8b56-530edc3c44ea%2FrId258.png?table=block&amp;id=22679fc2-53f7-4e83-af7c-0cd66a217b05" alt="notion image" loading="lazy" decoding="async"/></div></figure><li>可以看出单轮攻击后会出现后门准确值的衰减。</li><li>这里后门攻击效果，条纹墙要好于绿色车，可能是因为绿色车更贴近良性数据。</li><ul class="notion-list notion-list-disc notion-block-2d5c55f82d33431391d80bd221b6ddf4"><div class="notion-text notion-block-d9b68e36818143f184496acb6c086963">在单词预测中一些不常见的单词组会与driving Jeep也容易被遗忘。</div></ul><li>而右图是多个client，相当于会重复攻击，实验中有10个左右client就可以达到不错的效果。</li></ul><li>​<!-- -->​</li><ul class="notion-list notion-list-disc notion-block-e1fa5d4c459f4d45b3f78520ec69cf57"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-a5d1d82aabb847b18e473c42da0c2820"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:700px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2Fe44cb63b-c3bc-413a-b718-2cb123f86232%2FrId265.png?table=block&amp;id=a5d1d82a-abb8-47b1-8e47-3c42da0c2820" alt="notion image" loading="lazy" decoding="async"/></div></figure><li>这个实验是针对在多少轮进行攻击的效果，可以看出越往后模型接近收敛的时候实施攻击，可以保留的轮次越多，效果越好。</li></ul></ul></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-49cbed1629e4415f9e2c040071ad5bd5" data-id="49cbed1629e4415f9e2c040071ad5bd5"><span><div 
id="49cbed1629e4415f9e2c040071ad5bd5" class="notion-header-anchor"></div><a class="notion-hash-link" href="#49cbed1629e4415f9e2c040071ad5bd5" title="Word-prediction"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Word-prediction</span></span></h4><ul class="notion-list notion-list-disc notion-block-e1f146ef162d42db92135c40a40eeeda"><li>​<!-- -->​</li><ul class="notion-list notion-list-disc notion-block-e1f146ef162d42db92135c40a40eeeda"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-05e7fbc44cdb48e3858eb704160eef0b"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:700px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2Fbac056ac-e1de-4c7d-a286-759dba8a5190%2FrId272.png?table=block&amp;id=05e7fbc4-4cdb-48e3-858e-b704160eef0b" alt="notion image" loading="lazy" decoding="async"/></div></figure></ul></ul><ul class="notion-list notion-list-disc notion-block-80582eea20cc4d3a8eddeb18a29bc0c4"><li>和图像分类一样，不同的后门效果有所不同：当预测词的流行度居中时，与预测词属于非常流行或非常不流行这两个极端的情况相比，需要更小的更新范数。</li></ul><ul class="notion-list notion-list-disc notion-block-0c96d04ffec6410d983ab7ea55afc10d"><li>实验中，较小的γ会让主要任务有比较高的准确率，而且较大的γ不会对全局模型准确性有较大的损失，所以攻击者选取γ的余地挺多。</li></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-e7a99bf0a8324f319d6b709189f5002f" data-id="e7a99bf0a8324f319d6b709189f5002f"><span><div id="e7a99bf0a8324f319d6b709189f5002f" class="notion-header-anchor"></div><a 
class="notion-hash-link" href="#e7a99bf0a8324f319d6b709189f5002f" title="Defence"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Defence</span></span></h4><ul class="notion-list notion-list-disc notion-block-c623719dc17e4a5e91adf4243e167029"><li>对模型进行加密聚合，虽然保护了模型的机密性，但是也让对模型的异常检测变得困难。</li></ul><ul class="notion-list notion-list-disc notion-block-3ae643b4d7c5460899c184b337d7a4ae"><li>拜占庭式容忍聚合机制可以减轻后门攻击，代价是丢弃许多良性参与者的模型更新，即使在没有攻击的情况下，也会显著降低生成的模型的准确性，并侵犯训练数据的隐私。</li></ul><ul class="notion-list notion-list-disc notion-block-dbcb05543ded4b77a5858062f5c012de"><li>参与者级别的差异隐私可以降低后门攻击的有效性，但只能以降低模型在主要任务上的性能为代价。</li></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-3347bb2e37af4b36abfad9091eded0ee" data-id="3347bb2e37af4b36abfad9091eded0ee"><span><div id="3347bb2e37af4b36abfad9091eded0ee" class="notion-header-anchor"></div><a class="notion-hash-link" href="#3347bb2e37af4b36abfad9091eded0ee" title="Related WORK"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Related WORK</span></span></h4><ul class="notion-list notion-list-disc 
notion-block-c4c1478c11a0459b9d3a1d7948dceb0b"><li>之前针对机器学习的攻击主要利用数据中毒，或者直接修改模型组件来插入后门。但是对于有大量client的FL来说，这类攻击几乎没有效果，因为大量良性模型会抵消中毒特征。</li></ul><ul class="notion-list notion-list-disc notion-block-88670a033a5f406aa32a0985c093d849"><li>传统的防御就是剪枝或者检查数据的异常值、做过滤等，但是需要检查者获取数据或者真实的模型，所以不能用于FL。</li></ul><ul class="notion-list notion-list-disc notion-block-990c8c147a21457c9a00723a14274878"><li>安全多方计算无法保护模型完整性，就是可以保护模型不被外人得到，但是无法保证模型本身没有问题。安全聚合也能保护机密性，但是对模型中毒无能为力，反而使其更难检测。</li></ul><ul class="notion-list notion-list-disc notion-block-f5afbb7b8bd34bd9ac0ddb36233e0c6f"><li>还讲了defence中关于拜占庭容错分布式学习和参与者级别差分隐私的一些内容。</li></ul><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-3e390223bb09499c820b99baeb326c8c" data-id="3e390223bb09499c820b99baeb326c8c"><span><div id="3e390223bb09499c820b99baeb326c8c" class="notion-header-anchor"></div><a class="notion-hash-link" href="#3e390223bb09499c820b99baeb326c8c" title="Communication-Efficient Learning of Deep Networks from Decentralized Data"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Communication-Efficient Learning of Deep Networks from Decentralized Data</span></span></h2><div class="notion-text notion-block-7522fca477a9426b82b199893d92aa18">international conference on artificial intelligence and statistics</div><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-69ecd4dfe535413dad0363611198be68" data-id="69ecd4dfe535413dad0363611198be68"><span><div id="69ecd4dfe535413dad0363611198be68" class="notion-header-anchor"></div><a class="notion-hash-link" href="#69ecd4dfe535413dad0363611198be68" 
title="Summary"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Summary</span></span></h4><ul class="notion-list notion-list-disc notion-block-b2c8510fbda346b49745a7c4504ca99f"><li>当前机器学习模型训练中存在着数据隐私保护问题，所以作者提出了FL概念。通过分布式+隐私保护进行训练模型。对不平衡、non-IID的数据也更合适。</li></ul><ul class="notion-list notion-list-disc notion-block-b8e0313bf1fd4e4ba74c698fc00e2980"><li>主要提出了FedSGD和FedAvg算法。FedAvg通信代价要小于FedSGD</li></ul><ul class="notion-list notion-list-disc notion-block-6779c9d2992c4d6e941261cfda714581"><li>实验中使用了MNIST数字识别和语言模型，通过改变E和B(client本地训练的轮次、每次训练小批量数据的size)来探究最优的E和B，而且在non-IID和IID两种数据中都进行了相关实验。</li></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-aac80843ac4a42999e2684df161b2919" data-id="aac80843ac4a42999e2684df161b2919"><span><div id="aac80843ac4a42999e2684df161b2919" class="notion-header-anchor"></div><a class="notion-hash-link" href="#aac80843ac4a42999e2684df161b2919" title="Problem and Background"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Problem and Background</span></span></h4><ul class="notion-list notion-list-disc 
notion-block-dc7afea9c5114d338fb80d21528a73f9"><li>目前我们拥有的一些设备（手机、PC）具有一定的算力，也可以接触到大量的包含个人隐私的数据。而且这些数据相较于传统的分布式学习是unbalanced and non-IID。每个client的数据肯定是不同规模，不同倾向的。</li></ul><ul class="notion-list notion-list-disc notion-block-9f34088409dc4ba6a207f4e5eb2cabbe"><li>所以我们希望提出一种模型训练的方法，可以在不同的设备上利用本地隐私数据进行模型训练。</li></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-7a5eebc323104a50afe26d76967651b7" data-id="7a5eebc323104a50afe26d76967651b7"><span><div id="7a5eebc323104a50afe26d76967651b7" class="notion-header-anchor"></div><a class="notion-hash-link" href="#7a5eebc323104a50afe26d76967651b7" title="Contributions"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Contributions</span></span></h4><ul class="notion-list notion-list-disc notion-block-059225fbfd4b473985fe0cd1d7f88f9b"><li>提出Federal Learning这一概念。可用于人工智能隐私保护和安全多方计算。</li></ul><ul class="notion-list notion-list-disc notion-block-107a283a097f4ed68c04106b3b49cb0b"><li>FedAvg算法有一定的实用性，用于unbalanced and non-IID dataset，可以用较少的通信轮次获取不错的训练模型。</li></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-c1c4e9a4be54402698503da72368d563" data-id="c1c4e9a4be54402698503da72368d563"><span><div id="c1c4e9a4be54402698503da72368d563" class="notion-header-anchor"></div><a class="notion-hash-link" href="#c1c4e9a4be54402698503da72368d563" title="Method"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 
1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Method</span></span></h4><ul class="notion-list notion-list-disc notion-block-f6b1d7170d8c4b828d97715d6d8c4b18"><li>FedSGD and FedAvg</li><ul class="notion-list notion-list-disc notion-block-f6b1d7170d8c4b828d97715d6d8c4b18"><li>C-fraction of clients on each round</li><li>E-本地客户端训练local data的次数</li><li>B-训练时本地数据小批量的大小，就是每次训练用多少数据。当B无穷大就代表每次训练本地的所有数据都要用。</li><li>当B无穷大，E为1的时候，就是用所有数据每轮训练一次，那就是FedSGD Algorithm。</li><li>​<!-- -->​</li><ul class="notion-list notion-list-disc notion-block-6d17f242609b4221b9971b73491675a1"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-e6fe777118b442c2bc922545cfb823c2"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:700px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2Fa1196817-3002-4ef6-b96a-5bdf55052abf%2FrId304.png?table=block&amp;id=e6fe7771-18b4-42c2-bc92-2545cfb823c2" alt="notion image" loading="lazy" decoding="async"/></div></figure><li>fedsgd：client发送给server的是g（梯度）</li><li>fedavg：client直接用g求出要更新的ω发送给server。</li></ul></ul></ul><ul class="notion-list notion-list-disc notion-block-0a1fd416e3e6490aba5ba4304a3cfe9d"><li>algorithm 1 FedAvg</li><ul class="notion-list notion-list-disc notion-block-0a1fd416e3e6490aba5ba4304a3cfe9d"><li>​<!-- -->​</li><ul class="notion-list notion-list-disc notion-block-2319bd759fb049399581d85939d7f58a"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-92a9c7d9dc244ae0bf2344bf3df9ab5e"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:700px;max-width:100%;flex-direction:column"><img style="object-fit:cover" 
src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2Fad32e553-e813-4fa1-a6b9-1c0905c05917%2FrId311.png?table=block&amp;id=92a9c7d9-dc24-4ae0-bf23-44bf3df9ab5e" alt="notion image" loading="lazy" decoding="async"/></div></figure></ul><li>其中服务器平均更新的权重是每个client的数据集占总的数据集的比重。（但是这里如果有client谎报自己的数据集大小怎么办，让自己用比较少的dataset占较大的比重）</li></ul></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-4f410ad2ca6b4e2596d0fc0e67726904" data-id="4f410ad2ca6b4e2596d0fc0e67726904"><span><div id="4f410ad2ca6b4e2596d0fc0e67726904" class="notion-header-anchor"></div><a class="notion-hash-link" href="#4f410ad2ca6b4e2596d0fc0e67726904" title="Experimental"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Experimental</span></span></h4><ul class="notion-list notion-list-disc notion-block-ed5b3e7bc19b490d83a88d74f1105e78"><li>MNIST digit recognition</li><ul class="notion-list notion-list-disc notion-block-ed5b3e7bc19b490d83a88d74f1105e78"><li>MNIST 2NN</li><ul class="notion-list notion-list-disc notion-block-1b4efd3cfea242d0ac8e3d47dc702615"><li>具有2个隐藏层的多层感知器，每层有 200 个使用 ReLu 激活的单元(总计 199,210 个参数)</li></ul><li>CNN</li><ul class="notion-list notion-list-disc notion-block-a0af6da12026482e8414b3d5eb836049"><li>CNN有两个5x5卷积层(第一个有32个通道，第二个有64个，每个都有2x2个最大池化)，一个有512个单元和 ReLu 激活的全连接层，以及一个最终的 Softmax 输出层(总参数为 1,663,370)。</li></ul><li>两种数据分发方式：IID and non-IID</li><ul class="notion-list notion-list-disc notion-block-25bd1f69f53d46769ce241a363437a14"><li>IID：每个client随机获得600个样例。</li><li>non-IID 
将用例按数字先分好类，再分成300size的200份。最后每个client只有两个数字的样例。</li></ul></ul></ul><ul class="notion-list notion-list-disc notion-block-0fb2e1652ffd47ad938537be36f515bc"><li>Language Modeling</li><ul class="notion-list notion-list-disc notion-block-0fb2e1652ffd47ad938537be36f515bc"><li>dataset:莎士比亚作品集</li><li>为作品集中所有说话的角色建立一个client，共1146个。将每个client的数据集八二分，80用于训练，20用于测试。与MNIST数据集不同的是这个每个client的数据是不平衡的，就是训练可用数据多少不同。</li><li>该模型将一系列字符作为输入，并将每个字符嵌入到学习的 8 维空间中。然后通过 2 个 LSTM 层处理嵌入的字符，每个层有 256 个节点。最后，第二个 LSTM 层的输出被发送到 softmax 输出层，每个字符一个节点。完整模型有 866,578 个参数，我们使用 80 个字符的展开长度进行训练。</li></ul></ul><ul class="notion-list notion-list-disc notion-block-ff49208aea4347c9815aafa5579d4194"><li>Result</li><ul class="notion-list notion-list-disc notion-block-ff49208aea4347c9815aafa5579d4194"><li>论文中讲解了相关实验的结果，主要研究了当B和E变化时对实验的影响，当B为10，E为5的时候效果比较好，因为B正无穷，E为1的时候就成立FedSGD。</li></ul></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-f9f97bdb2f7c452a8f9eac64164d64da" data-id="f9f97bdb2f7c452a8f9eac64164d64da"><span><div id="f9f97bdb2f7c452a8f9eac64164d64da" class="notion-header-anchor"></div><a class="notion-hash-link" href="#f9f97bdb2f7c452a8f9eac64164d64da" title="Related Knowledge"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Related Knowledge</span></span></h4><ul class="notion-list notion-list-disc notion-block-e550f745d41240eeaea795fca813bfcc"><li>SGD</li><ul class="notion-list notion-list-disc notion-block-e550f745d41240eeaea795fca813bfcc"><li>利用梯度觉得接下来迭代的方向，使代价函数越来越小。</li><li>SGD为随机梯度下降法。用数学式可以将 SGD 写成如下的式（6.1）。<!-- -->
​</li><ul class="notion-list notion-list-disc notion-block-492da172713442228d33c78ab996a92b"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-3344aab26f3946d9bbd761d3e94447ed"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:268px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2Feb0369de-1519-4550-a99b-e7593dc27308%2FrId334.gif?table=block&amp;id=3344aab2-6f39-46d9-bbd7-61d3e94447ed" alt="notion image" loading="lazy" decoding="async"/></div></figure><div class="notion-text notion-block-7ba1deb225a0426ea6ea829020aa6c00">这里把需要更新的权重参数记为W，把损失函数关于W的梯度记为 ∂L/∂W 。η 表示学习率，实际上会取 0.01 或 0.001 这些事先决定好的值。式子中的←表示用右边的值更新左边的值。（引自CSDN博主「赵孝正」，原文链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://blog.csdn.net/weixin_46713695/article/details/123198293">https://blog.csdn.net/weixin_46713695/article/details/123198293</a>）</div></ul><li><code class="notion-inline-code">​
import matplotlib.pyplot as plt
#一元四次函数 f(x) = x^4 - 3x^3+2
xold = 0
xnew = 6
​
#误差
eps = 0.00002
#步长，学习率
alpha = 0.01
#function
x = []
y = []
plt.title(&quot;Trend of SGD&quot;)
plt.xlabel(&#x27;x&#x27;)
plt.ylabel(&#x27;y&#x27;)
​
def f(x): return x ** 4 - 3 * x ** 3 + 2
def f_prime(x):  # 导数
    return 4 * x ** 3 - 9 * x ** 2
while abs(f(xold) - f(xnew)) &gt; eps:
    xold = xnew
    xnew = xold - alpha * f_prime(xold)
    x.append(xnew)
    y.append(f(xnew))
    plt.scatter(xnew, f(xnew))
    plt.pause(0.2)
plt.plot(x,y)
plt.show()
print(x)
print(&quot;Result:&quot;,xnew,f(xnew))</code></li></ul></ul><ul class="notion-list notion-list-disc notion-block-bf4f855b3a4a4f4b837483fd6d84539c"><li>卷积层</li><ul class="notion-list notion-list-disc notion-block-bf4f855b3a4a4f4b837483fd6d84539c"><li><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://www.zhihu.com/question/22298352/answer/228543288">(10 条消息) 如何通俗易懂地解释卷积？ - 知乎 (zhihu.com)</a></li><li><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://blog.csdn.net/qq_31820761/article/details/102588339">(27条消息) 卷积层详述Zeus_dad的博客-CSDN博客卷积层</a></li></ul></ul><ul class="notion-list notion-list-disc notion-block-b505ac6c6de94c4887d47d8618fe1ff0"><li>池化</li><ul class="notion-list notion-list-disc notion-block-b505ac6c6de94c4887d47d8618fe1ff0"><li><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://www.jiqizhixin.com/graph/technologies/0a4cedf0-0ee0-4406-946e-2877950da91d">池化 | 机器之心 (jiqizhixin.com)</a></li></ul></ul><ul class="notion-list notion-list-disc notion-block-637c516326e348b9be287c8ebe5e8f7f"><li>完全连接层</li><ul class="notion-list notion-list-disc notion-block-637c516326e348b9be287c8ebe5e8f7f"><li><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://zhuanlan.zhihu.com/p/33841176">CNN 入门讲解：什么是全连接层（Fully Connected Layer）? 
- 知乎 (zhihu.com)</a></li></ul></ul><ul class="notion-list notion-list-disc notion-block-1366150f234e4fb6a2f3889cb902422d"><li>ReLU activation</li><ul class="notion-list notion-list-disc notion-block-1366150f234e4fb6a2f3889cb902422d"><li><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://zhuanlan.zhihu.com/p/427473958">原来ReLU这么好用！一文带你深度了解ReLU激活函数！ - 知乎 (zhihu.com)</a></li><li>考虑输入输出及数据变化来选择用什么激活函数</li></ul></ul><ul class="notion-list notion-list-disc notion-block-761f1cde74ec48f5ba4c7ea9ff4af5e6"><li>softmax output</li><ul class="notion-list notion-list-disc notion-block-761f1cde74ec48f5ba4c7ea9ff4af5e6"><li><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://blog.csdn.net/u014541881/article/details/124671518">(27条消息) 神经网络中的softmax层为何可以解决分类问题——神经网络之softmax(3)石头1666的博客-CSDN博客softmax分类层</a></li></ul></ul><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-172156b378754dd591eb91034e7d8765" data-id="172156b378754dd591eb91034e7d8765"><span><div id="172156b378754dd591eb91034e7d8765" class="notion-header-anchor"></div><a class="notion-hash-link" href="#172156b378754dd591eb91034e7d8765" title="Federated Optimization in Heterogeneous Networks"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Federated Optimization in Heterogeneous Networks</span></span></h2><ul class="notion-list notion-list-disc notion-block-19aa364ac0c7418e996416d3160cfe5b"><li>MLSys（机器学习系统会议）、Tian Li（CMU）</li></ul><ul class="notion-list notion-list-disc 
notion-block-7675c5c8fac547929ca031b500bfa8ed"><li>异构网络中的联邦优化</li></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-42b91ae0fc714a9391131b97b3c9b0d9" data-id="42b91ae0fc714a9391131b97b3c9b0d9"><span><div id="42b91ae0fc714a9391131b97b3c9b0d9" class="notion-header-anchor"></div><a class="notion-hash-link" href="#42b91ae0fc714a9391131b97b3c9b0d9" title="Summary"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Summary</span></span></h4><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-f871022c995b49a39f4dca41e2cc6731" data-id="f871022c995b49a39f4dca41e2cc6731"><span><div id="f871022c995b49a39f4dca41e2cc6731" class="notion-header-anchor"></div><a class="notion-hash-link" href="#f871022c995b49a39f4dca41e2cc6731" title="Method"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Method</span></span></h4><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-8d69671c5a4b4784a4c255dc789d8c9a" data-id="8d69671c5a4b4784a4c255dc789d8c9a"><span><div id="8d69671c5a4b4784a4c255dc789d8c9a" class="notion-header-anchor"></div><a class="notion-hash-link" href="#8d69671c5a4b4784a4c255dc789d8c9a" 
title="FedAvg（谷歌）"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">FedAvg（谷歌）</span></span></h4><ul class="notion-list notion-list-disc notion-block-c691d04561704ad58224af06c822620d"><li>​<!-- -->​</li><ul class="notion-list notion-list-disc notion-block-c691d04561704ad58224af06c822620d"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-82a1312547a343a298ce48662a576e64"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:700px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2F77491a17-f0b9-4b88-83b1-dbde006ccc4c%2FrId364.png?table=block&amp;id=82a13125-47a3-43a2-98ce-48662a576e64" alt="notion image" loading="lazy" decoding="async"/></div></figure></ul></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-5faf9bd93ad74ea2a89015d82242604e" data-id="5faf9bd93ad74ea2a89015d82242604e"><span><div id="5faf9bd93ad74ea2a89015d82242604e" class="notion-header-anchor"></div><a class="notion-hash-link" href="#5faf9bd93ad74ea2a89015d82242604e" title="FedProx（本文）"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 
1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">FedProx（本文）</span></span></h4><div class="notion-text notion-block-dc29108999c0419584339b6b206fbe7a">​</div><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-17e30064696b43308a9c4bc1b0955bd9"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:700px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2Fa68d1b5f-c696-4648-9cbc-e74fc4b83b0d%2FrId369.png?table=block&amp;id=17e30064-696b-4330-8a9c-4bc1b0955bd9" alt="notion image" loading="lazy" decoding="async"/></div></figure><div class="notion-text notion-block-9500fc1468d349b09470bc8353384dc7">​</div><ul class="notion-list notion-list-disc notion-block-8bad826bf4b543318c20fc6aae8d6767"><li>与FedAvg相比，FedProx多出的参数为μ、γ和<em>w</em>0<!-- -->，少了的是local epoch E。</li></ul><ul class="notion-list notion-list-disc notion-block-5effeb90013c492e8c10435128793a0e"><li>​​</li></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-0b1cf69278274b7787c16875ef64edb2" data-id="0b1cf69278274b7787c16875ef64edb2"><span><div id="0b1cf69278274b7787c16875ef64edb2" class="notion-header-anchor"></div><a class="notion-hash-link" href="#0b1cf69278274b7787c16875ef64edb2" title="Experiment"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Experiment</span></span></h4><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-9219a7da8b974cc3a423d763eeedcb01" 
data-id="9219a7da8b974cc3a423d763eeedcb01"><span><div id="9219a7da8b974cc3a423d763eeedcb01" class="notion-header-anchor"></div><a class="notion-hash-link" href="#9219a7da8b974cc3a423d763eeedcb01" title="Related Work"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Related Work</span></span></h4><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-f95563f2bfc14a7c9f5ec662949b4679" data-id="f95563f2bfc14a7c9f5ec662949b4679"><span><div id="f95563f2bfc14a7c9f5ec662949b4679" class="notion-header-anchor"></div><a class="notion-hash-link" href="#f95563f2bfc14a7c9f5ec662949b4679" title="FedAvg"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">FedAvg</span></span></h4><ul class="notion-list notion-list-disc notion-block-edf0268a1c2d42738f9c1d63bbfd6fe6"><li>FedAvg是一种在各client上对本地数据做局部随机梯度下降、再由服务器对模型进行平均来更新全局模型的方法。</li><ul class="notion-list notion-list-disc notion-block-edf0268a1c2d42738f9c1d63bbfd6fe6"><li>但是因为异构性，它在现实中使用有一定的挑战。</li><li>最近也有工作在非FL环境中分析FedAvg，比如在IID数据中进行FedAvg的更新，但其结果依赖于同一迭代中使用相同求解器（比如都是SGD）这一假设。</li></ul></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-4e78ffa452544ae89bada918c347c301" 
data-id="4e78ffa452544ae89bada918c347c301"><span><div id="4e78ffa452544ae89bada918c347c301" class="notion-header-anchor"></div><a class="notion-hash-link" href="#4e78ffa452544ae89bada918c347c301" title="统计异构"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">统计异构</span></span></h4><ul class="notion-list notion-list-disc notion-block-e1826cc84f994bd485a0649c73f59b18"><li>现有处理统计异构的方法都有一定的假设限制，而且有的方法无法解决非凸问题。</li></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-95eafc77ae014ec8a3a64469d506aee2" data-id="95eafc77ae014ec8a3a64469d506aee2"><span><div id="95eafc77ae014ec8a3a64469d506aee2" class="notion-header-anchor"></div><a class="notion-hash-link" href="#95eafc77ae014ec8a3a64469d506aee2" title="系统异构"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">系统异构</span></span></h4><ul class="notion-list notion-list-disc notion-block-ff892772ae20431eae520933adbcba44"><li>设备网络、存储、通信能力、计算能力都不同，这就是系统异构。</li></ul><ul class="notion-list notion-list-disc notion-block-7ab6b0c4587347be9c60090fe94551db"><li>如果简单放弃某些“掉队者”，则会影响收敛，也会带来一定的数据偏差。</li></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-e7bd1cb8affa45aeb80588853f47db30" 
data-id="e7bd1cb8affa45aeb80588853f47db30"><span><div id="e7bd1cb8affa45aeb80588853f47db30" class="notion-header-anchor"></div><a class="notion-hash-link" href="#e7bd1cb8affa45aeb80588853f47db30" title="FedProx"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">FedProx</span></span></h4><ul class="notion-list notion-list-disc notion-block-4e8786ad221e4d4bbbf8b36f197c6b0b"><li>在异构环境中可用，同时保持FedAvg原有的隐私和计算优势</li></ul><ul class="notion-list notion-list-disc notion-block-89028a61c9184625a6350040eaa14e62"><li>分析了方法的收敛性</li></ul><ul class="notion-list notion-list-disc notion-block-3e0d5857dbb1456d856374f409dc9c65"><li>对统计异构的处理方法，受到求解线性方程组的随机化Kaczmarz方法的启发，该方法的类似假设已被用于分析其他情况下的SGD变种；</li></ul><ul class="notion-list notion-list-disc notion-block-a134bdad00c344b0924717596f7ba2b0"><li>所提框架在异构联邦网络中有更好的鲁棒性和稳定性。</li></ul><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-690c38c4b730489fb9e605803eb4611a" data-id="690c38c4b730489fb9e605803eb4611a"><span><div id="690c38c4b730489fb9e605803eb4611a" class="notion-header-anchor"></div><a class="notion-hash-link" href="#690c38c4b730489fb9e605803eb4611a" title="Untitled"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 
0z"></path></svg></a><span class="notion-h-title">Untitled</span></span></h2><div class="notion-text notion-block-f74b01e444b040b5baff12e7d4a8b584">‍</div></main></div>]]></content:encoded>
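上面 FedProx 一节提到，它在本地目标函数中加入近端项 μ/2·‖w−w^t‖²，把本地更新拉向全局模型。下面是一个仅作示意的 NumPy 草图（非论文原始代码，fedprox_local_update、grad_fn 等命名以及二次本地损失都是本文为演示而做的假设），展示 μ 越大、本地解越靠近全局模型这一效果：

```python
import numpy as np

def fedprox_local_update(w_global, grad_fn, mu=0.1, lr=0.1, steps=20):
    """对单个client做若干步带近端项的梯度下降（示意）。
    本地目标: F_k(w) + (mu/2)*norm(w - w_global)^2
    """
    w = w_global.copy()
    for _ in range(steps):
        g = grad_fn(w) + mu * (w - w_global)  # 近端项的梯度 mu*(w - w^t)
        w = w - lr * g
    return w

# 假设的本地损失 F_k(w) = 0.5*norm(w - w_star)^2，其梯度为 w - w_star
w_star = np.array([1.0, -2.0])
w_global = np.zeros(2)
w_mu0 = fedprox_local_update(w_global, lambda w: w - w_star, mu=0.0)
w_mu1 = fedprox_local_update(w_global, lambda w: w - w_star, mu=1.0)
# mu 越大，本地解离全局模型 w_global 越近
```

当 μ=0 时即退化为普通的本地梯度下降；在这个二次损失假设下，μ=1 的本地解收敛到本地最优点与全局模型的中点。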
        </item>
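上面第一篇笔记描述的 MNIST non-IID 划分（按数字排序后切成200个大小为300的shard，每个client分得2个shard，因此每个client最多只包含两个数字）可以用如下 Python 草图实现（示意代码，partition_non_iid 等命名为本文假设）：

```python
import random

def partition_non_iid(labels, num_shards=200, shard_size=300, shards_per_client=2):
    """按标签排序后切成 num_shards 个大小为 shard_size 的shard，
    每个client随机分得 shards_per_client 个shard（示意实现）。"""
    order = sorted(range(len(labels)), key=lambda i: labels[i])  # 按数字先分好类
    shards = [order[s * shard_size:(s + 1) * shard_size] for s in range(num_shards)]
    random.shuffle(shards)  # shard 随机分配给client
    num_clients = num_shards // shards_per_client
    return [sum(shards[c * shards_per_client:(c + 1) * shards_per_client], [])
            for c in range(num_clients)]

# 6万个假设的"MNIST标签"（每个数字6000个样本）
labels = [i % 10 for i in range(60000)]
clients = partition_non_iid(labels)
# 得到100个client，每个600个样本，且每个client最多只含两个数字
```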
        <item>
            <title><![CDATA[Data preprocess]]></title>
            <link>https://notion-next-blue-seven.vercel.app/article/18af16c9-0a9e-4470-b1a7-cd73f6894dbf</link>
            <guid>https://notion-next-blue-seven.vercel.app/article/18af16c9-0a9e-4470-b1a7-cd73f6894dbf</guid>
            <pubDate>Fri, 02 Jul 2021 00:00:00 GMT</pubDate>
            <content:encoded><![CDATA[<div id="container" class="mx-auto undefined"><main class="notion light-mode notion-page notion-block-18af16c90a9e4470b1a7cd73f6894dbf"><div class="notion-viewport"></div><div class="notion-collection-page-properties"></div><blockquote class="notion-quote notion-block-8ca9d20ae8164011b9399b8ef1f8b84f"><div>文章来源说明</div></blockquote><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-f34cfc804c1f406a95c0ff6e166f3503"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:100%;max-width:100%;flex-direction:column;height:100%"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2F58653f9f-d827-4ae8-845e-0ea88a47549e%2FUntitled.png?table=block&amp;id=f34cfc80-4c1f-406a-95c0-ff6e166f3503" alt="notion image" loading="lazy" decoding="async"/></div></figure><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-c61587d08f2b440e89d8e3465ee17b92"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:100%;max-width:100%;flex-direction:column;height:100%"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2F4c1bcf49-cd9d-4755-a7db-597bcd02fe68%2FUntitled.png?table=block&amp;id=c61587d0-8f2b-440e-89d8-e3465ee17b92" alt="notion image" loading="lazy" decoding="async"/></div></figure><h3 class="notion-h notion-h2 notion-h-indent-0 notion-block-80fa372f17a742fea4d3484e27c33c4a" data-id="80fa372f17a742fea4d3484e27c33c4a"><span><div id="80fa372f17a742fea4d3484e27c33c4a" class="notion-header-anchor"></div><a class="notion-hash-link" href="#80fa372f17a742fea4d3484e27c33c4a" title="Session Construction"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 
1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Session Construction</span></span></h3><ul class="notion-list notion-list-disc notion-block-7e4f0c25471048f5a207ce139632de35"><li>TCP、UDP 和 ICMP 数据包首先分别用于构建会话。TCP、UDP和ICMP会话分别由五元组定义，称为会话ID，可以标识唯一的会话。会话ID与记录一一对应。具体来说，TCP会话ID和UDP会话ID一样，由<b>协议类型、IP源地址、IP目的地址、源端口和目的端口</b>组成。同样，ICMP会话ID由<b>协议类型、IP源地址、IP目的地址、ICMP类型和ICMP代码</b>组成。每个会话内应用层的有效负载被单独拼接起来。</li></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-f9ce7ed564344aed8da0db5a845c5489" data-id="f9ce7ed564344aed8da0db5a845c5489"><span><div id="f9ce7ed564344aed8da0db5a845c5489" class="notion-header-anchor"></div><a class="notion-hash-link" href="#f9ce7ed564344aed8da0db5a845c5489" title="Record Construction"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Record Construction</span></span></h4><ul class="notion-list notion-list-disc notion-block-46c145e64ead499f9de5c33bb8d21293"><li>特征的维度由上表可知为1000维，前17个位置保留用于数据包header的特征，剩下的983个为payload。payload不足983维的补0，前17维中缺失的header特征也用0填充，超出的payload则被截断。</li></ul><ul class="notion-list notion-list-disc notion-block-8cbf970300e944eb926c7c6c8d82f81a"><li>协议类型用one-hot编码占前三个位置：TCP为100，UDP为010，ICMP为001。</li></ul><ul class="notion-list notion-list-disc notion-block-adb42bd8f3304a04b766a7c4d39e623e"><li>消除源IP和目的IP的影响。</li></ul><ul class="notion-list notion-list-disc 
notion-block-ebce6f7619284e079c64959379ca604d"><li>interval_mean、interval_varience：计算会话中数据包的时间间隔，并将其均值和方差作为记录的时间特征。</li></ul><ul class="notion-list notion-list-disc notion-block-9ca995ae003141e4ba4f4113231f6ccb"><li>tcp_flag中包含：FIN, SYN, RST, PSH, ACK, URG, ECE, CWR。这些是会话级的特征，比如此会话中有10个数据包的SYN标志位为1，则Record数据集中此Session的Record SYN = 10。</li></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-4d69125c4f034ef78ba2ac90114ae2b6" data-id="4d69125c4f034ef78ba2ac90114ae2b6"><span><div id="4d69125c4f034ef78ba2ac90114ae2b6" class="notion-header-anchor"></div><a class="notion-hash-link" href="#4d69125c4f034ef78ba2ac90114ae2b6" title="Normalization"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Normalization</span></span></h4><ul class="notion-list notion-list-disc notion-block-503bc56964f444409015f1054499c85d"><li>payload中的每个字节取值在0-255之间，将其除以255归一化到0-1之间。</li></ul><ul class="notion-list notion-list-disc notion-block-70ead775857c4342936bdca85d07fb36"><li>对于一些我们无法预知取值范围的特征，这里用最小-最大标准化（min-max normalization）处理。</li></ul><ul class="notion-list notion-list-disc notion-block-10e90384dce84b3e945e7ed39a4324e6"></ul></main></div>]]></content:encoded>
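上面 Record Construction 与 Normalization 的构造流程（1000维 = 17维header特征 + 983维payload；协议one-hot占前三位；payload每字节除以255归一化；不足补0、超出截断）可以用如下 Python 草图表示（示意代码，build_record 的命名与header特征的具体排布均为本文假设，并非原文实现）：

```python
def build_record(protocol, payload_bytes, header_feats, dim=1000, header_dim=17):
    """构造定长1000维记录（示意）：前17维为header特征，后983维为payload。"""
    # 协议one-hot占前三个位置：TCP=100, UDP=010, ICMP=001
    onehot = {"TCP": [1, 0, 0], "UDP": [0, 1, 0], "ICMP": [0, 0, 1]}[protocol]
    header = onehot + list(header_feats)
    header += [0.0] * (header_dim - len(header))        # 缺失的header特征补0
    payload = [b / 255 for b in payload_bytes[:dim - header_dim]]  # 字节归一化+截断
    payload += [0.0] * (dim - header_dim - len(payload))           # 不足补0
    return header[:header_dim] + payload

# 一条假设的UDP记录：1024字节payload会被截断到983维
rec = build_record("UDP", bytes(range(256)) * 4, header_feats=[0.5, 0.2])
```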
        </item>
        <item>
            <title><![CDATA[Trojaning Attack on Neural Networks]]></title>
            <link>https://notion-next-blue-seven.vercel.app/article/aa342b33-94c3-4e09-85d5-853c9b4159f2</link>
            <guid>https://notion-next-blue-seven.vercel.app/article/aa342b33-94c3-4e09-85d5-853c9b4159f2</guid>
            <pubDate>Sun, 11 Jun 2023 00:00:00 GMT</pubDate>
            <description><![CDATA[针对神经网络的后门攻击]]></description>
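下文介绍的木马触发器生成算法（在掩码限定的触发区域内用梯度下降搜索输入，使选定神经元逼近目标值，并用掩码的 Hadamard 乘积屏蔽触发器之外的梯度）可以用如下极简草图示意。这里用一个线性加权和充当"神经元"，generate_trigger 等命名均为本文为演示而做的假设，并非论文原始实现：

```python
import numpy as np

def generate_trigger(neuron_fn, grad_fn, mask, target=100.0, lr=0.5, steps=200):
    """在掩码区域内用梯度下降搜索输入，使选定神经元逼近目标值（示意）。
    cost = (target - neuron_fn(x))^2
    """
    rng = np.random.default_rng(0)
    x = rng.random(mask.shape) * mask          # 触发区域随机初始化，其余为0
    for _ in range(steps):
        cost_grad = -2 * (target - neuron_fn(x)) * grad_fn(x)
        x -= lr * (cost_grad * mask)           # Hadamard乘积屏蔽掩码外的梯度
    return x

# 假设的"神经元"：输入的加权和 w·x，其对x的梯度恒为 w
w = np.linspace(0.1, 1.0, 10)
mask = np.array([1.0] * 4 + [0.0] * 6)         # 只允许前4个输入变量变化
trig = generate_trigger(lambda x: w @ x, lambda x: w, mask)
# 迭代后 w @ trig 逼近目标值100，且掩码外的位置保持为0
```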
            <content:encoded><![CDATA[<div id="container" class="mx-auto undefined"><main class="notion light-mode notion-page notion-block-aa342b3394c34e0985d5853c9b4159f2"><div class="notion-viewport"></div><div class="notion-collection-page-properties"></div><blockquote class="notion-quote notion-block-804b20c2a36f41ef8b022050c6352525"><div>NDSS 2018</div></blockquote><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-9ccfb1006b0b426392566ff64ace2fba" data-id="9ccfb1006b0b426392566ff64ace2fba"><span><div id="9ccfb1006b0b426392566ff64ace2fba" class="notion-header-anchor"></div><a class="notion-hash-link" href="#9ccfb1006b0b426392566ff64ace2fba" title="Summary"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Summary</span></span></h2><ul class="notion-list notion-list-disc notion-block-c20698b0a8df4f9899305b7dbf88ba90"><li>提出了Trojan方法，向神经网络中加入触发器后门，使触发器影响模型的输出，而不影响其余正常的任务结果。</li></ul><ul class="notion-list notion-list-disc notion-block-ca9ed427b82840d8a93305857901a999"><li>对触发器生成及注入的过程进行了一定的优化（使用搜索算法寻找影响最大的触发器和神经元），并逆向模型生成输入，并加上触发器进行再训练，注入后门。</li></ul><ul class="notion-list notion-list-disc notion-block-6a14be182f834de88ee6b534bfcda947"><li>对五个深度学习任务进行了实验，进行了大量的评估实验，取得了较好的效果。</li></ul><ul class="notion-list notion-list-disc notion-block-d750ef60cf8f4ba0a66beb91a7d8b34f"><li>讨论了一些可能的防御手段，比如扰动。</li></ul><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-36d7360d9c1b40beb925db9d4398bd64" data-id="36d7360d9c1b40beb925db9d4398bd64"><span><div id="36d7360d9c1b40beb925db9d4398bd64" class="notion-header-anchor"></div><a 
class="notion-hash-link" href="#36d7360d9c1b40beb925db9d4398bd64" title="Method"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Method</span></span></h2><h3 class="notion-h notion-h2 notion-h-indent-1 notion-block-a01426aca19f4dcf8ba93b412d1cc77b" data-id="a01426aca19f4dcf8ba93b412d1cc77b"><span><div id="a01426aca19f4dcf8ba93b412d1cc77b" class="notion-header-anchor"></div><a class="notion-hash-link" href="#a01426aca19f4dcf8ba93b412d1cc77b" title="威胁模型和概述"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">威胁模型和概述</span></span></h3><ul class="notion-list notion-list-disc notion-block-b0f4eb019e8946548986e6a2c9e5161e"><li>威胁模型</li><ul class="notion-list notion-list-disc notion-block-b0f4eb019e8946548986e6a2c9e5161e"><li>攻击者可以访问神经模型，但是一般无法访问训练数据。可以利用逆向产生的数据和额外的类似的数据进行重训练。</li></ul></ul><ul class="notion-list notion-list-disc notion-block-e996d3adf03447458cf60ea768561ea8"><li>概述
</li><ul class="notion-list notion-list-disc notion-block-e996d3adf03447458cf60ea768561ea8"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-6f86bbdeac9b4a899386fa34b9d894b2"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:480px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2Fb62cb663-1ecd-40a1-b1c6-0821b2f6970b%2FUntitled.png?table=block&amp;id=6f86bbde-ac9b-4a89-9386-fa34b9d894b2" alt="notion image" loading="lazy" decoding="async"/></div></figure><li>在神经网络中，内部神经元可以被视为内部特征。根据神经元和输出之间的链接权重，不同的特征对最终模型输出有不同的影响。我们的攻击本质上是选择一些与木马触发器紧密相关的神经元，然后重新训练从这些神经元到输出的链接，以便可以操纵输出（例如，实现伪装木马触发器）。</li><li>Trojan 触发器生成</li><ul class="notion-list notion-list-disc notion-block-2a93423cd75944869272ab8f62ed6e50"><li>1. 首先选择触发器掩码，作为触发器变化的初始状态</li><ul class="notion-list notion-list-disc notion-block-747836ed0e054dd7bf7802b9316c4036"><div class="notion-text notion-block-7e9d11d0cfd84466854b405bde4598c3">2. 选择内部层中一个或者多个神经元。神经元的选择方式使得它们的值可以通过更改触发器掩码中的输入变量来轻松操纵</div><div class="notion-text notion-block-f483704f5f944a8aaf05c41675150d70">3. 
木马触发器生成算法，该算法在触发器掩码中搜索输入变量的赋值，以便选定的神经元可以达到最大值</div></ul></ul><li>训练数据生成</li><ul class="notion-list notion-list-disc notion-block-915d60a6feb6474a9f0cefcf5a884335"><li>假设不可以访问原始训练数据，因此我们需要获取一组数据，这些数据可用于模型再训练，当遇到原数据集中的原始数据时表现正常，当遇到带有触发器的数据时输出特定的label</li><li>从一个图像开始，该图像是通过对来自不相关的公共数据集的所有真实图像进行平均而生成，模型从中为目标输出生成非常低的分类置信度（即 0.1）</li><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-ca6793c670b94e1185cf2185a4979980"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:480px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2F92f65a48-7071-4d70-a9c6-d90511cbdea3%2FUntitled.png?table=block&amp;id=ca6793c6-70b9-4e11-85cf-2185a4979980" alt="notion image" loading="lazy" decoding="async"/></div></figure><li>training data生成算法不断改变图像像素值，直到目标输出的置信值大于其他节点</li><li>循环每个输出节点，生成一批训练数据集。虽然大多数时候生成的图像与源数据集相差很大，但是其对权重的影响作用类似。循环每个输出节点的意思是：开始会选择一个输出节点作为目标输出，然后产生对这个目标输出结果置信度较大的数据，然后循环所有的节点，产生会输出每一个结果的数据。</li></ul><li>再训练模型</li><ul class="notion-list notion-list-disc notion-block-2b337c79f9c144f080472ef8cfe4b62a"><li>再训练一部分模型，即选择的神经元及其以后的神经元。</li><li>再训练的实质：</li><div class="notion-text notion-block-3ca6c68d58fc4839ad59e76fc2745375">1. 建立触发器和选择的神经元的强相关性，就是加粗的神经元的weight从0.5到1</div><div class="notion-text notion-block-22654d84e0fa44668f25de4d3467f406">2. 
削弱其他神经元，特别是与Trojan相关的神经元</div><li>可以从概览图看出经过训练后，选择的神经元与后面的神经元权重由0.5→1，其他的神经元权重都有一定程度的变小。</li></ul></ul></ul><h3 class="notion-h notion-h2 notion-h-indent-1 notion-block-8e7a2a3c14f2455aa5f22f65ae46bfb8" data-id="8e7a2a3c14f2455aa5f22f65ae46bfb8"><span><div id="8e7a2a3c14f2455aa5f22f65ae46bfb8" class="notion-header-anchor"></div><a class="notion-hash-link" href="#8e7a2a3c14f2455aa5f22f65ae46bfb8" title="Attack design"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Attack design</span></span></h3><ul class="notion-list notion-list-disc notion-block-475dcfddb9b846b480032747571a5e9d"><li>这里主要讲神经元选择、触发器生成、再训练数据生成三个步骤。再训练的步骤方法相对简单，这里不再赘述。</li></ul><ul class="notion-list notion-list-disc notion-block-1e39d07f1d504bb7a8da68650ff7389e"><li>Trojan触发器生成</li><ul class="notion-list notion-list-disc notion-block-1e39d07f1d504bb7a8da68650ff7389e"><li>cost function：当前值与所选神经元的<b>预期值</b>之间的差异。这个预期值怎么确定？是不是就是之前提到的神经元目标值？</li><ul class="notion-list notion-list-disc notion-block-2f3d6500e7cd43c5bb65967bd02fa14e"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-d3bb022e7a1f465eba0c98bab0825592"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:528px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2Fb22de84c-4a7e-4307-899c-4e367b74b04b%2FUntitled.png?table=block&amp;id=d3bb022e-7a1f-465e-ba0c-98bab0825592" alt="notion image" loading="lazy" 
decoding="async"/></div></figure></ul><li>M：初始化掩码是个布尔值矩阵，其维度与模型输入相同，1表示模型输入中对应的输入变量用于触发器生成。可以通过不同的0/1矩阵控制触发器的shape。</li><li>第二行：它接受模型作为输入并在指定层输出神经元值，本质上是截取到指定层为止的部分模型，所以有fn1、fn2。（但是这里不应该随着x的变化而变化吗？）</li><li>第三行：根据mask M初始化输入数据x——mask init()将输入数据x的木马触发区域（掩码为1的位置）初始化为随机值，其他部分初始化为0。</li><li>第四行：定义损失函数，即指定神经元的值与其目标值之间的均方误差。</li><li>第五行：进行<b>梯度下降</b>以找到使成本函数最小化的 x。</li><li>第六行：计算成本函数关于输入 x 的梯度 Δ。</li><li>第七行：通过执行 Hadamard 乘积（即梯度 Δ 和掩码矩阵 M 的逐元素乘积），屏蔽掉梯度 Δ 中木马触发器之外的区域。</li></ul></ul><ul class="notion-list notion-list-disc notion-block-8f411a9f9fc24bc6a050e67a0e9eec11"><li>神经元选择</li><ul class="notion-list notion-list-disc notion-block-8f411a9f9fc24bc6a050e67a0e9eec11"><li>选择神经元时，避免难以改变权重的神经元。本文发现这些神经元与其相邻层中的其他神经元没有强连接，即连接这些神经元与其前后层的权重小于其他神经元。</li><li>为了选择合适的神经元，这里会分析这个神经元与之前神经元的权重W。选择与下一层连接最紧密的神经元，比如选择fc6中的神经元，要计算fc6与fc7的权重。</li><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-f1ea22201fbb49f192604d08fb722a20"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:594px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2F365a4687-6502-407b-b541-e9d51e7a902c%2FUntitled.png?table=block&amp;id=f1ea2220-1fbb-49f1-9260-4d08fb722a20" alt="notion image" loading="lazy" decoding="async"/></div></figure></ul></ul><ul class="notion-list notion-list-disc notion-block-66ffb90c2e53493c8f85619a46d66cb7"><li>训练数据生成</li><ul class="notion-list notion-list-disc notion-block-66ffb90c2e53493c8f85619a46d66cb7"><li>给定一个输出分类标签（例如，人脸识别中的 A.J. 
Buckley），我们的算法旨在生成一个模型输入，该输入可以<b>高置信度地激发标签</b>。</li><li>与算法一不同的是这里的n是输出神经元，1中的n是目标神经元。</li><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-1d074fe1fca94881a474fc0337468be4"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:678px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2F9ed1961d-a88d-4ce1-91c8-445a14d00047%2FUntitled.png?table=block&amp;id=1d074fe1-fca9-4881-a474-fc0337468be4" alt="notion image" loading="lazy" decoding="async"/></div></figure><li>第二行初始化输入数据，然后在第 3 行，成本函数定义为输出标签值与其目标值之间的均方误差。就是预测与真实输出的误差。在第 4-8 行中，我们使用梯度下降来找到使成本函数最小化的 x。在第 5 行，计算了关于输入 x 的梯度。在第 6 行，x 以 lr 步向梯度 Δ 变换。在第 7 行，对 x 应用降噪函数以减少生成的输入中的噪声，这样我们就可以在后面的再训练步骤中获得更好的精度。详细信息将在本节后面介绍。我们对每个输出分类标签的模型输入进行逆向工程。最后，我们获得了一组模型输入，作为下一步的训练数据。通过反向传播计算梯度，训练数据生成的计算成本与输入数据的维度和大小以及木马模型的复杂性成正比。</li><li>第7行的降噪函数，denoise() 函数通过最小化总方差来降低噪声 [42]。总体思路是减少每个输入元素与其相邻元素之间的差异。就相当于平滑输出。</li></ul></ul><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-75c992457a4246d3aa45ce244e59c0ba" data-id="75c992457a4246d3aa45ce244e59c0ba"><span><div id="75c992457a4246d3aa45ce244e59c0ba" class="notion-header-anchor"></div><a class="notion-hash-link" href="#75c992457a4246d3aa45ce244e59c0ba" title="Experiment"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Experiment</span></span></h2><h3 class="notion-h notion-h2 notion-h-indent-1 notion-block-8c9acaa2b95c42b0ac04fc304ec98164" 
data-id="8c9acaa2b95c42b0ac04fc304ec98164"><span><div id="8c9acaa2b95c42b0ac04fc304ec98164" class="notion-header-anchor"></div><a class="notion-hash-link" href="#8c9acaa2b95c42b0ac04fc304ec98164" title="与其他方法对比"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">与其他方法对比</span></span></h3><ul class="notion-list notion-list-disc notion-block-7bdf82b5e7924064b1c91d20f9f6496a"><li>增量学习</li><ul class="notion-list notion-list-disc notion-block-7bdf82b5e7924064b1c91d20f9f6496a"><li>增量学习是一种学习策略，它可以扩展现有模型以适应新数据，以便扩展模型不仅适用于额外的数据，还保留有关旧数据的知识。就是直接用新数据在训练好的模型上继续训练。</li><li>但是增量学习对原始数据加tri的准确率特别低，因为增量学习只会微调权重，不能改变太多原有知识。</li></ul></ul><ul class="notion-list notion-list-disc notion-block-905224be705f43ab84d746e3f7f5b141"><li>模型参数回归攻击</li><ul class="notion-list notion-list-disc notion-block-905224be705f43ab84d746e3f7f5b141"><li>假设可以访问一小部分训练数据，有可能发布者会公开一部分数据集，也有可能通过社工获取一部分数据。</li><li>训练M‘来分辨d和d with tri，再用d训练模型M，比较这两个模型之间的神经元差异。</li><li>我们创建了一个部分训练集的子集列表，其中的子集大小不断增加，其中一个包含其前身。然后我们根据每个子集重新训练模型。为了保证木马模型在原始数据上表现良好，我们在重新训练期间将初始权重设置为原始模型的权重。此时，我们获得了几个木马模型，每个模型都在不同大小的子集上进行训练。然后，我们尝试通过回归分析推断出一个描述不断增长的再训练数据子集与 NN 权重之间关系的数学模型。然后我们从数学模型中预测最终的木马神经网络。我们尝试了三种回归模型：线性、二次多项式和指数。</li><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-b81a11dbaa304a38a1f319606b0210b1"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:480px;max-width:100%;flex-direction:column"><img style="object-fit:cover" 
src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2F122fc5f1-4a53-439f-a422-8745c3dc25a5%2FUntitled.png?table=block&amp;id=b81a11db-aa30-4a38-a1f3-19606b0210b1" alt="notion image" loading="lazy" decoding="async"/></div></figure><div class="notion-blank notion-block-6585f75219cb438ca9ff08a28e4bc598"> </div></ul></ul><ul class="notion-list notion-list-disc notion-block-3aeaeadf82ff4dc5ae940b0148fa4983"><li>任意触发器</li><ul class="notion-list notion-list-disc notion-block-3aeaeadf82ff4dc5ae940b0148fa4983"><li>寻找对应于任意木马触发器的神经元。我们的设计是首先选择一些内部神经元，然后从选定的神经元生成木马触发器。触发器是计算出来的，而不是由攻击者提供的。另一种是加入更加自然的触发器，比如苹果的logo，而不对其进行变换。</li><li>但是效果也不好，原因可能是大量的神经元与这个触发器相关，但是没有特别相关的神经元。</li><div class="notion-blank notion-block-58906c3fd45946d28773a213a2fd5701"> </div></ul></ul><h3 class="notion-h notion-h2 notion-h-indent-1 notion-block-a37686046c61408ebb11830b4c234443" data-id="a37686046c61408ebb11830b4c234443"><span><div id="a37686046c61408ebb11830b4c234443" class="notion-header-anchor"></div><a class="notion-hash-link" href="#a37686046c61408ebb11830b4c234443" title="针对五个机器学习任务的实验"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">针对五个机器学习任务的实验</span></span></h3><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-703b9e87985d4119ac4c489fa766029e"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:528px;max-width:100%;flex-direction:column"><img style="object-fit:cover" 
src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2F7395cc1c-d603-4fe4-9396-b8f4532bec5f%2FUntitled.png?table=block&amp;id=703b9e87-985d-4119-ac4c-489fa766029e" alt="notion image" loading="lazy" decoding="async"/></div></figure><ul class="notion-list notion-list-disc notion-block-7978d1a804774c9696cc01051020fe52"><li>上图</li><ul class="notion-list notion-list-disc notion-block-7978d1a804774c9696cc01051020fe52"><li>木马触发器的size，比如7*70表示，触发器的size占比为7%，木马触发器的透明度为70%。就是M矩阵中有百分之70为0.</li><li>木马模型对于原始数据相比于原始模型对原始数据的准确率正确性最高下降（Dec）3.5%，这是个可以接受的范围。根据进一步研究，发现这种情况一般是由边界情况导致。</li></ul></ul><ul class="notion-list notion-list-disc notion-block-cf892441a04b4661952d8216369cdb63"><li>将本文提出的神经元选择算法和随机选择比较</li><ul class="notion-list notion-list-disc notion-block-cf892441a04b4661952d8216369cdb63"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-948e7b2718774e10acd6bafa8c9556b7"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:480px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2Fcee710ec-dca3-4eeb-b9d5-99eadb8557a2%2FUntitled.png?table=block&amp;id=948e7b27-1877-4e10-acd6-bafa8c9556b7" alt="notion image" loading="lazy" decoding="async"/></div></figure></ul></ul><ul class="notion-list notion-list-disc notion-block-107bdd81b9e34753990ddc02b454e602"><li>与使用输出神经元的比较。，本文的方法是选择内部神经元，而有的方法是改变输出神经元的weight而实现触发。</li><ul class="notion-list notion-list-disc notion-block-107bdd81b9e34753990ddc02b454e602"><li>可以看出使用输出神经元效果很差，对权重的改变也比较小。原因可能是对其他神经元的影响太小。</li><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-71246ba7b4ee4e57a06d44ff0eecc4db"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:432px;max-width:100%;flex-direction:column"><img style="object-fit:cover" 
src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2F6faf3b01-7ed4-4fbf-b9bb-d87ef464ba6f%2FUntitled.png?table=block&amp;id=71246ba7-b4ee-4e57-a06d-44ff0eecc4db" alt="notion image" loading="lazy" decoding="async"/></div></figure></ul></ul><ul class="notion-list notion-list-disc notion-block-ca30b4cda9a54f2ba73447e4a59f0444"><li>这里仅概括人脸识别的例子，其他例子请参考具体论文</li></ul><ul class="notion-list notion-list-disc notion-block-1f02a97ca2d54d0295c4fac6838841f9"><li>攻击效果评判标准</li><ul class="notion-list notion-list-disc notion-block-1f02a97ca2d54d0295c4fac6838841f9"><li>一是可以触发器正确触发木马行为，二是正常输入不会触发木马行为。</li></ul></ul><ul class="notion-list notion-list-disc notion-block-00eddf0e8064422e8e637cc501b74b80"><li>人脸识别（FR）</li><ul class="notion-list notion-list-disc notion-block-00eddf0e8064422e8e637cc501b74b80"><li>模型层选择，不同的层级有两个不同的影响：</li><ul class="notion-list notion-list-disc notion-block-42c10d15cdab48d398c9c67a4a2de643"><div class="notion-text notion-block-b9ddc692f2fc49fd975a9a0b3080b03f">木马触发器中有效部分的百分比和重新训练阶段的可调神经元数量,而且重新训练中只改变选  择层之后的神经元权重，所以选择靠近输出的层的时候，会造成retraining影响较小。往往效果最好的是中间的层。过前过后都不太好。</div><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-ac12291fd4a1480e8465daa21c5f1727"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:432px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2Ffa1a404d-d814-4c6a-9d65-64da2a6f788b%2FUntitled.png?table=block&amp;id=ac12291f-d4a1-480e-8465-daa21c5f1727" alt="notion image" loading="lazy" decoding="async"/></div></figure></ul><li>评估不同数量的神经元对攻击的影响</li><ul class="notion-list notion-list-disc notion-block-dbc1fca8186f4fe7a8b15bfc96837d61"><li>更多的神经元会导致测试精度降低，尤其是在原始数据集和带有木马触发器的原始数据集上。一些神经元很难反转，反转这些神经元会导致性能不佳。对更少的神经元进行特洛伊木马将使攻击更加<b>隐蔽</b>，并且在存在攻击触发器的情况下更有可能激活隐藏的有效载荷。</li></ul><li>触发器shape的影响</li><ul class="notion-list 
notion-list-disc notion-block-6f0e8efd6c1c40a7a97f24b44f67e5ff"><li>带有触发器的原始数据上进行测试，水印的效果较差。一些层会将具有最大神经元值的神经元集中在一个固定区域内，并将其传递给下一层。水印形状遍布整个图像，与其他两种形状相比，其对应的神经元被汇集并传递给其他神经元的机会更少。因此更难触发木马模型中的注入行为。</li></ul><li>触发器透明度的影响</li><ul class="notion-list notion-list-disc notion-block-ba7d28b6c39449b787c23bf7740e43f0"><li>触发器透明度越高，触发器越隐蔽，但是效果也更差，所以需要权衡。</li></ul><li>总的实验结果</li><ul class="notion-list notion-list-disc notion-block-2497c7d956994df5894f62f627620c43"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-66d09f5cc1d44f4c9b26cd676f6fb394"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:100%;max-width:100%;flex-direction:column;height:100%"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2Feda62ce2-d91b-4bb5-8c62-8704dfa97e60%2FUntitled.png?table=block&amp;id=66d09f5c-c1d4-4f4c-9b26-cd676f6fb394" alt="notion image" loading="lazy" decoding="async"/></div></figure></ul></ul></ul></main></div>]]></content:encoded>
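上面 Attack design 一节描述的木马触发器生成（算法一：用掩码 M 限定触发器区域，在区域内做梯度下降，使选定神经元的值逼近目标值）可以用下面的极简 numpy 草图示意。其中的单层模型 w、目标值 10.0、学习率和迭代次数都是为演示虚构的玩具假设，并非论文的原始实现：

```python
import numpy as np

def generate_trigger(w, target, mask, lr=0.5, epochs=200):
    """算法一的示意：掩码限定的梯度下降。
    cost 为选定神经元取值与目标值之间的均方误差。"""
    rng = np.random.default_rng(0)
    x = rng.random(w.shape[1]) * mask          # mask_init()：触发区域随机、其余为 0
    for _ in range(epochs):
        v = w @ x                              # 选定层上目标神经元的取值
        grad = 2 * (v - target) @ w            # cost = (v - target)^2 对 x 的梯度
        x -= lr * grad * mask                  # Hadamard 乘积：只更新触发器区域
    return x

# 玩具"模型"：单层、4 维输入、1 个目标神经元；触发器只占前两个输入变量
w = np.array([[0.2, 0.8, 0.1, 0.4]])
mask = np.array([1.0, 1.0, 0.0, 0.0])
trig = generate_trigger(w, target=10.0, mask=mask)
```

可以验证生成的触发器只在 M 为 1 的位置有非零值，并把选定神经元的输出推到了目标值附近。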
        </item>
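上文训练数据生成算法第 7 行的 denoise() 通过最小化总方差来降噪，即减小每个输入元素与相邻元素之间的差异（论文引用的是 [42] 的方法）。下面用一维信号草拟这一思想；平滑权重、迭代次数和函数形式均为演示假设：

```python
import numpy as np

def denoise(x, weight=0.2, iters=10):
    """"总方差"平滑的一维示意：反复把每个元素向其左右邻居拉近，
    从而减小相邻元素之间的差异，相当于平滑输出。"""
    x = x.astype(float).copy()
    for _ in range(iters):
        left = np.roll(x, 1);  left[0] = x[0]      # 边界处不引入新值
        right = np.roll(x, -1); right[-1] = x[-1]
        x += weight * ((left - x) + (right - x))   # 类似热扩散的一步
    return x

noisy = np.array([0.0, 5.0, 0.0, 5.0, 0.0, 5.0])
smooth = denoise(noisy)
```

平滑后信号的总方差（相邻差分绝对值之和）明显小于原信号，对应论文中"再训练阶段能获得更好精度"的动机。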
        <item>
            <title><![CDATA[Kitsune: An Ensemble of Autoencoders for Online Network Intrusion Detection]]></title>
            <link>https://notion-next-blue-seven.vercel.app/article/75d746b1-4c0e-4221-bd65-f87ddfb5a813</link>
            <guid>https://notion-next-blue-seven.vercel.app/article/75d746b1-4c0e-4221-bd65-f87ddfb5a813</guid>
            <pubDate>Sun, 28 May 2023 00:00:00 GMT</pubDate>
            <description><![CDATA[基于集成自动编码器的入侵检测]]></description>
            <content:encoded><![CDATA[<div id="container" class="mx-auto undefined"><main class="notion light-mode notion-page notion-block-75d746b14c0e4221bd65f87ddfb5a813"><div class="notion-viewport"></div><div class="notion-collection-page-properties"></div><blockquote class="notion-quote notion-block-190bf19004cd42bc906b942118e2ac27"><div>NDSS 2018 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/ymirsky/KitNET-py">https://github.com/ymirsky/KitNET-py</a></div></blockquote><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-f1a74d2ebc5e4d55837c8ae3fb91407e" data-id="f1a74d2ebc5e4d55837c8ae3fb91407e"><span><div id="f1a74d2ebc5e4d55837c8ae3fb91407e" class="notion-header-anchor"></div><a class="notion-hash-link" href="#f1a74d2ebc5e4d55837c8ae3fb91407e" title="🤔 Kitsune: An Ensemble of Autoencoders for Online Network Intrusion Detection"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">🤔 <b>Kitsune: An Ensemble of Autoencoders for Online Network Intrusion Detection</b></span></span></h2><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-7f2ceca71a734ee1ba5da914a5172205" data-id="7f2ceca71a734ee1ba5da914a5172205"><span><div id="7f2ceca71a734ee1ba5da914a5172205" class="notion-header-anchor"></div><a class="notion-hash-link" href="#7f2ceca71a734ee1ba5da914a5172205" title="Summary"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 
0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Summary</span></span></h2><ul class="notion-list notion-list-disc notion-block-453ef12e37d849b78192a5f7a778d733"><li>本文提出了一套即插即用的入侵检测方案Kitsune。可以在没有监督的情况下检测对本地网络的攻击，并以高效的在线方式。Kitsune的核心算法（KitNET）使用一组称为自动编码器的神经网络来共同区分正常和异常流量模式，通过计算RMSE（均方根误差），先保留一组正常的流量的RMSE，然后部署之后，接收未知的流量，当RMSE与正常的相比较大的时候，就会发出警报，说明新来的流量不正常。</li></ul><ul class="notion-list notion-list-disc notion-block-edfc076ec10c4bce824f645a08ca81a1"><li><b>Contributions</b></li><ul class="notion-list notion-list-disc notion-block-edfc076ec10c4bce824f645a08ca81a1"><li>基于自动编码器的 NIDS，用于简单的网络设备 (Kitsune)，它是轻量级和即插即用的。</li><li>一种特征提取框架，用于从网络流量动态维护和提取隐式上下文特征。</li><li>无监督方式自动构建自动编码器集合的在线技术（即，将特征映射到 ANN 输入）</li><li>进行了大量的实验，并应用于实际环境中。</li></ul></ul><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-246c8bf258bc4efb8334b40e25c8b93a" data-id="246c8bf258bc4efb8334b40e25c8b93a"><span><div id="246c8bf258bc4efb8334b40e25c8b93a" class="notion-header-anchor"></div><a class="notion-hash-link" href="#246c8bf258bc4efb8334b40e25c8b93a" title="Method "><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Method </span></span></h2><h3 class="notion-h notion-h2 notion-h-indent-1 notion-block-f0de945bb37a49d8bfc6d0bc58ccfc93" data-id="f0de945bb37a49d8bfc6d0bc58ccfc93"><span><div id="f0de945bb37a49d8bfc6d0bc58ccfc93" 
class="notion-header-anchor"></div><a class="notion-hash-link" href="#f0de945bb37a49d8bfc6d0bc58ccfc93" title="Overview"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Overview</span></span></h3><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-4f049f2a8fb14365aea872295ad5b49e"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:100%;max-width:100%;flex-direction:column;height:100%"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2F6d23c3d4-0407-4c70-baed-7a66f02348fe%2FUntitled.png?table=block&amp;id=4f049f2a-8fb1-4365-aea8-72295ad5b49e" alt="notion image" loading="lazy" decoding="async"/></div></figure><ul class="notion-list notion-list-disc notion-block-288f5a8e0adf41b89c561e00fcbe19ef"><li>由架构图可以看出，Kitsune主要由四部分组成，首先是数据预处理，将packet处理成需要的输出格式。然后是FE进行特征提取，提取完后交给FM进行映射，这里要将115个（代码是100个）特征进行聚类，将相关系数较大的特征归位一簇（处理所有的特征更高效），然后每个簇里的n<em>i</em>的特征由自动编码器<b>AE</b><em>i</em>处理，输出给异常检测AD，AD主要由一个AE组成输出层，输出层AE的输入是n个自动编码器的输出。</li></ul><ul class="notion-list notion-list-disc notion-block-c3749504c1e74b10a774d64a453410e5"><li>实例代码由Python给出，作者实际部署在真实环境中是用CPython和C++写的，效率更高。</li></ul><ul class="notion-list notion-list-disc notion-block-65e7885b28cf43aea6846674d9ac857f"><li><b>Packet Capturer</b></li><ul class="notion-list notion-list-disc notion-block-65e7885b28cf43aea6846674d9ac857f"><li>使用外部库或者工具：tshark or scapy。主要负责捕获原始的数据流量。</li></ul></ul><ul class="notion-list notion-list-disc 
notion-block-bbd34969d18a4aa7b2280cdf9acbdcf4"><li><b>Packet Parser</b></li><ul class="notion-list notion-list-disc notion-block-bbd34969d18a4aa7b2280cdf9acbdcf4"><li>接收原始二进制，解析数据包，并将数据包的元信息发送到FE。例如，数据包到达时间、大小和网络地址，还有协议类型、IP类型等信息，code提取了20个特征。</li><li>这里也需要使用外部库或者工具进行数据包解析，tshark或者Packet++（<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/seladb/PcapPlusPlus">https://github.com/seladb/PcapPlusPlus</a>）</li></ul></ul><ul class="notion-list notion-list-disc notion-block-6901d92c2300446c8f6fac74cc306b66"><li><b>Feature Extractor</b></li><ul class="notion-list notion-list-disc notion-block-6901d92c2300446c8f6fac74cc306b66"><li>从解析后的数据包信息中提取特征，并将其处理成数值，而不是IP这种字符串。</li><li>比如code中的这一句，将MAC加IP作为一个特征：</li><ul class="notion-list notion-list-disc notion-block-46ba4f5b61bf48a7a1bae5ae672079c5"><li><code class="notion-inline-code">MIstat[(i*3):((i+1)*3)] = self.HT_MI.update_get_1D_Stats(srcMAC+srcIP, timestamp, datagramSize, self.Lambdas[i]) #MAC IP关系</code></li></ul><li>输出~x.</li></ul></ul><ul class="notion-list notion-list-disc notion-block-6c9a933cf4d9430ab1ad2891892d73ac"><li><b>Feature Mapper</b></li><ul class="notion-list notion-list-disc notion-block-6c9a933cf4d9430ab1ad2891892d73ac"><li>负责从 ~x 创建一组较小的实例（表示为 v），并将 v 传递给异常检测器 (AD) 中的组件。该组件还负责学习从 ~x 到 v 的映射。</li><li>这里分为两个模式。</li><ul class="notion-list notion-list-disc notion-block-3bf5a51cb4de4d60868fd9c6ec6fc845"><li>训练模式</li><ul class="notion-list notion-list-disc notion-block-399a583bdade436b9aecc9bd2dd64a03"><li>将 ~x 的特征分组为最大大小为 m 的集合（簇）。在训练模式结束时，map被传递到 AD，AD 使用map来构建集成架构（每组在集成中形成自动编码器的输入），就是map记录100个特征中哪几个维组成一个簇，每一个簇是一个AE的输入。</li></ul><li>执行模式</li><ul class="notion-list notion-list-disc notion-block-f86bf2e91d064549bc33e1d74585a33b"><li>学习到的映射用于从 ~x 创建小实例 v 的集合，然后将其传递给 AD 集成层中的各个自动编码器。训练模式负责获取映射map，执行模式利用map把新实例 ~x 映射为 v。</li><li>xi就是从100维输入x中提取每个AE的输入（根据map）。</li></ul></ul></ul></ul><ul class="notion-list notion-list-disc notion-block-fa8f68e01a634b0ebb36058558ebc20e"><li><b>Anomaly 
Detector</b></li><ul class="notion-list notion-list-disc notion-block-fa8f68e01a634b0ebb36058558ebc20e"><li>接收FM的实例v，根据RMSE判断是否异常。</li><li>也分为训练和执行两个模式</li><ul class="notion-list notion-list-disc notion-block-b1fca810cdb845dbabca3ba248b131b7"><li>训练Mode：训练集成层中的m个AE，保留训练用的正常数据的RMSE。</li><li>执行Mode：计算FM发来的实例v，比较输出层的输入RMSE与训练阶段记录的RMSE的差距，过大的话就警报。</li></ul><li>AD中的输出层和AE集成层的训练、执行过程一样，因为都是AE，只是集成层中AE的数量更多。</li></ul></ul><h4 class="notion-h notion-h3 notion-h-indent-2 notion-block-c6817af4f96b4cb7be2f3fddeddfda30" data-id="c6817af4f96b4cb7be2f3fddeddfda30"><span><div id="c6817af4f96b4cb7be2f3fddeddfda30" class="notion-header-anchor"></div><a class="notion-hash-link" href="#c6817af4f96b4cb7be2f3fddeddfda30" title="Feature Extractor (FE)"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Feature Extractor (FE)</span></span></h4><ul class="notion-list notion-list-disc notion-block-69440b380b5b49f2b612aa97c45b9036"><li>从网络流量中提取一定的特征具有以下挑战</li><ul class="notion-list notion-list-disc notion-block-69440b380b5b49f2b612aa97c45b9036"><li>来自不同通道（对话）的数据包交错</li><li>任何给定时刻都有许多通道</li><li>数据包到达率可能非常高。</li><li>最原始简单的方法是监控每一条对话的所有通信，但这在存储和带宽上是无法接受的</li></ul></ul><ul class="notion-list notion-list-disc notion-block-6bbac2efeb8d4ade92d4d45841fc6564"><li>因为上面的挑战，所以设计了在动态数量的数据流（网络通道）上进行时间统计的高速特征提取。该框架具有较小的内存占用，因为它使用在阻尼窗口上维护的增量统计信息。使用阻尼窗口意味着提取的特征是具有时间性的。</li><ul class="notion-list notion-list-disc notion-block-6bbac2efeb8d4ade92d4d45841fc6564"><li>IS := (N, LS, SS)，即 Number、Linear Sum、Squared Sum。通过接收X<em>i</em>，不断更新元组IS。</li><li>下面是更新的代码。</li></ul></ul><ul class="notion-list 
notion-list-disc notion-block-08218f0a10754e7ea4dc4f4f14a73642"><li>针对要忘掉旧实例的问题：解决方案是使用阻尼增量统计。在阻尼窗口模型中，随着时间的推移，较旧值的权重呈指数下降。令 d 为定义为的衰减函数</li><ul class="notion-list notion-list-disc notion-block-08218f0a10754e7ea4dc4f4f14a73642"><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-953d3affe3ee4b52b88815f8c2d48e47"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:509px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2Fa506a47a-2f3c-4ee0-949c-b7b82d8f8cca%2FUntitled.png?table=block&amp;id=953d3aff-e3ee-4b52-b888-15f8c2d48e47" alt="notion image" loading="lazy" decoding="async"/></div></figure></ul></ul><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-ebb42bef8ee84503a98947df3cfaf9d4"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:100%;max-width:100%;flex-direction:column;height:100%"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2Fd573bc91-11d5-49fc-92c2-3ed1c952b1aa%2FUntitled.png?table=block&amp;id=ebb42bef-8ee8-4503-a989-47df3cfaf9d4" alt="notion image" loading="lazy" decoding="async"/></div></figure><ul class="notion-list notion-list-disc notion-block-26949f4ae6cf4230a8ec7e22b941b604"><li>FE过程</li><ul class="notion-list notion-list-disc notion-block-26949f4ae6cf4230a8ec7e22b941b604"><li>每当数据包到达时，我们都会提取传送给定数据包的主机和协议的行为快照。 快照由 115 个流量统计信息组成，捕获一个小的时间窗口：(1) 数据包的一般发送方，以及 (2) 数据包发送方和接收方之间的流量。</li><li>一个时间窗口可以获取23个特征，从5个时间窗口就是115个特征。当有的窗口不包含某些特征的时候，进行归零，保证维度是115。code中一共100维特征。</li><li>FE 从总共五个时间窗口中提取同一组特征：100ms、500ms、1.5sec、10sec 和 1min 过去（λ = 5, 3, 1, 0.1, 0.01），总共 115 个特征。λ也会作为一个参数传输IS更新，</li></ul></ul><h4 class="notion-h notion-h3 notion-h-indent-2 notion-block-d9e28e5153804fd4a606a73b6dfbbaef" 
data-id="d9e28e5153804fd4a606a73b6dfbbaef"><span><div id="d9e28e5153804fd4a606a73b6dfbbaef" class="notion-header-anchor"></div><a class="notion-hash-link" href="#d9e28e5153804fd4a606a73b6dfbbaef" title="Feature Mapper"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Feature Mapper</span></span></h4><ul class="notion-list notion-list-disc notion-block-e21190024691483bb369b93c5a21c9cc"><li>将 ~x 的 n 个特征（维度）映射到 k 个较小的子实例中，每个子实例对应AD中的一个自动编码器、</li></ul><ul class="notion-list notion-list-disc notion-block-26caa46a098e41e8bfb1f7401872275d"><li>f(~x) = v,v就是x中的100维特征根据簇分为m个子特征集合。</li><ul class="notion-list notion-list-disc notion-block-26caa46a098e41e8bfb1f7401872275d"><li>产生map的code，具体看论文和代码。</li></ul></ul><h4 class="notion-h notion-h3 notion-h-indent-2 notion-block-ab4f30c2407d462f9181bd0b08ca7e37" data-id="ab4f30c2407d462f9181bd0b08ca7e37"><span><div id="ab4f30c2407d462f9181bd0b08ca7e37" class="notion-header-anchor"></div><a class="notion-hash-link" href="#ab4f30c2407d462f9181bd0b08ca7e37" title="Anomaly Detector"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Anomaly Detector</span></span></h4><ul class="notion-list 
notion-list-disc notion-block-bcfe46cd725142eebd04ebd4a36c8018"><li>集成层</li><ul class="notion-list notion-list-disc notion-block-bcfe46cd725142eebd04ebd4a36c8018"><li>一组有序的 k 个三层自动编码器，映射到 v 中的相应实例。该层负责测量 v 中每个子空间（实例）的独立异常。在训练期间，自动编码器学习它们各自的子空间的正常行为。在训练模式和执行模式下，每个自动编码器将其 RMSE 重建误差报告到输出层。</li></ul></ul><ul class="notion-list notion-list-disc notion-block-5964ac13fee0473a92d08500589b670c"><li>输出层</li><ul class="notion-list notion-list-disc notion-block-5964ac13fee0473a92d08500589b670c"><li>一个三层自动编码器，它学习集成层的正常（即训练模式）RMSE。该层负责产生最终的异常分数，考虑（1）子空间异常之间的关系，（2）网络流量中自然发生的噪声</li></ul></ul><ul class="notion-list notion-list-disc notion-block-bf1ca64f166549a9aee4a34648c4e6a0"><li>AE初始化</li><ul class="notion-list notion-list-disc notion-block-bf1ca64f166549a9aee4a34648c4e6a0"><li>da利用dA_params的参数进行初始化。</li></ul></ul><ul class="notion-list notion-list-disc notion-block-1fd282fd002340a2899ee018db54604c"><li>训练mode</li><ul class="notion-list notion-list-disc notion-block-1fd282fd002340a2899ee018db54604c"><li>更新两个层中所以AE的参数</li><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-53713b885ee34bab810498b675febafa"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:432px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2Fa95ada3c-9bdf-44a5-8249-5157f83a2a95%2FUntitled.png?table=block&amp;id=53713b88-5ee3-4bab-8104-98b675febafa" alt="notion image" loading="lazy" decoding="async"/></div></figure></ul></ul><ul class="notion-list notion-list-disc notion-block-9b6725238c2b44c3a42194123b921c87"><li>执行mode</li><ul class="notion-list notion-list-disc notion-block-9b6725238c2b44c3a42194123b921c87"><li>只前向传播，没有参数更新</li><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-d753717799e54917a945e2b456bbffa1"><div 
style="position:relative;display:flex;justify-content:center;align-self:center;width:528px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2F7c9a8c9b-969c-4a68-a218-680e3aac91bb%2FUntitled.png?table=block&amp;id=d7537177-99e5-4917-a945-e2b456bbffa1" alt="notion image" loading="lazy" decoding="async"/></div></figure></ul></ul><ul class="notion-list notion-list-disc notion-block-ebe739def44148b0a9d677453835307d"><li>输出</li><ul class="notion-list notion-list-disc notion-block-ebe739def44148b0a9d677453835307d"><li>输出：RMSE 异常分数，然后根据预警阈值进行判断。</li><li>异常阈值φ：
选择方法：1. 直接选用训练阶段出现过的最大分数（视为正常数据所能产生的最大异常分数），比它更大的分数必然更异常；
2. 概率选择φ，具体就是将输出的 RMSE 分数拟合到对数正态或非标准正态分布，然后如果 s 发生的概率非常低，则发出警报。就是正常的分数已经有了个分布，然后将新的输出分数对照一下分布概率，太小就是之前没遇到过，可以发出警报</li></ul></ul><h3 class="notion-h notion-h2 notion-h-indent-1 notion-block-9bf630210cda4dc5a86e99124241fac7" data-id="9bf630210cda4dc5a86e99124241fac7"><span><div id="9bf630210cda4dc5a86e99124241fac7" class="notion-header-anchor"></div><a class="notion-hash-link" href="#9bf630210cda4dc5a86e99124241fac7" title="Conclusion"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Conclusion</span></span></h3><ul class="notion-list notion-list-disc notion-block-6d2619a688f64e408792d4f97f32c9c0"><li>利用简单的ANN模型，用准确率换取实时性和处理速度。</li></ul><ul class="notion-list notion-list-disc notion-block-3c3c5ff891ae425aa765f81294d9681d"><li>工程上的贡献。</li></ul><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-9946032d8f2a4ba9b3fd9aea1eea2619" data-id="9946032d8f2a4ba9b3fd9aea1eea2619"><span><div id="9946032d8f2a4ba9b3fd9aea1eea2619" class="notion-header-anchor"></div><a class="notion-hash-link" href="#9946032d8f2a4ba9b3fd9aea1eea2619" title="Experiment"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Experiment</span></span></h2><h4 
class="notion-h notion-h3 notion-h-indent-1 notion-block-7b84cf0f3bc24e6fa1dd5200623411f8" data-id="7b84cf0f3bc24e6fa1dd5200623411f8"><span><div id="7b84cf0f3bc24e6fa1dd5200623411f8" class="notion-header-anchor"></div><a class="notion-hash-link" href="#7b84cf0f3bc24e6fa1dd5200623411f8" title="Dataset"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Dataset</span></span></h4><ul class="notion-list notion-list-disc notion-block-981f18fa21954a39bedee2dd4f556a49"><li>攻击数据集，而且表述了具体如何进行攻击（比如扫描和Fuzz），及破坏的什么特性.</li></ul><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-d58c11c620414a358412fdfe18dcedac"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:100%;max-width:100%;flex-direction:column;height:100%"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2F4d544220-77e2-4b1c-8eb9-d66a1a8aec7f%2FUntitled.png?table=block&amp;id=d58c11c6-2041-4a35-8412-fdfe18dcedac" alt="notion image" loading="lazy" decoding="async"/></div></figure><div class="notion-blank notion-block-80b20f2f1f4f4aee8cbfcde34ca85871"> </div><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-1fde7c48e2d44e0bb137ba0694d80f30" data-id="1fde7c48e2d44e0bb137ba0694d80f30"><span><div id="1fde7c48e2d44e0bb137ba0694d80f30" class="notion-header-anchor"></div><a class="notion-hash-link" href="#1fde7c48e2d44e0bb137ba0694d80f30" title="Setup"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 
1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Setup</span></span></h4><ul class="notion-list notion-list-disc notion-block-81537751514b4c97abb9caca028d03db"><li>离线的一般效果更好，但是需要更多的资源。所以这里将比较用的离线算法作为 Kitsune 的上限。</li></ul><ul class="notion-list notion-list-disc notion-block-bcfce0f8478b43c79fc39ccda3df8afd"><li>在线的NIDS，评估了 Suricata——一种基于签名的 NIDS。离线的，比较了Isolation Forests (IF，基于集合的离群点检测，应该是远离集合的就视为异常) [41] 和 Gaussian Mixture Models (GMM是一种基于期望最大化算法的统计方法)。</li></ul><ul class="notion-list notion-list-disc notion-block-19fb11bec6c542b889bbf68899f696e4"><li>评估指标</li><ul class="notion-list notion-list-disc notion-block-19fb11bec6c542b889bbf68899f696e4"><li>真阳性率，TPR</li><li>假阴性率，FNR</li><li>假阳性率，FPR</li></ul></ul><h3 class="notion-h notion-h2 notion-h-indent-1 notion-block-9bdf5db637bd42d9ad15f31848572730" data-id="9bdf5db637bd42d9ad15f31848572730"><span><div id="9bdf5db637bd42d9ad15f31848572730" class="notion-header-anchor"></div><a class="notion-hash-link" href="#9bdf5db637bd42d9ad15f31848572730" title="Evaluation"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">Evaluation</span></span></h3><ul class="notion-list notion-list-disc notion-block-0ba366dc46ef439ebedf6db4c6873f50"><li>在图 9 中，我们看到 Kitsune 相对于这些算法表现得非常好。特别是，Kitsune 
在检测 Active Wiretap（主动窃听）方面的表现甚至比 GMM 更好。此外，我们的算法在 AR、Fuzzing、Mirai、SSL R.、SYN 和 Active Wiretap 数据集上实现了比 GMM 更好的 EER。</li></ul><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-5dfb1252a1bd470893921c9a97bb9235"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:100%;max-width:100%;flex-direction:column;height:100%"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2Fa8ee52f8-f4dc-473c-b98e-2108d9400fbb%2FUntitled.png?table=block&amp;id=5dfb1252-a1bd-4708-9392-1c9a97bb9235" alt="notion image" loading="lazy" decoding="async"/></div></figure><ul class="notion-list notion-list-disc notion-block-c6c9ed218bb4478e8f90b788581fe430"><li>在树莓派和虚拟机上的数据包处理速度。如果只用单个AE，就要由一个AE处理所有的特征，所以会慢；如果使用 35 个AE组成的集成，则速度会提升很多倍。</li></ul><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-b44c41c5c6db441aa95106afdc71ba1f"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:100%;max-width:100%;flex-direction:column;height:100%"><img style="object-fit:cover" src="https://www.notion.so/image/https%3A%2F%2Fs3-us-west-2.amazonaws.com%2Fsecure.notion-static.com%2F82cbabd6-e5cb-410f-bb0f-4396de9bc1db%2FUntitled.png?table=block&amp;id=b44c41c5-c6db-441a-a951-06afdc71ba1f" alt="notion image" loading="lazy" decoding="async"/></div></figure><div class="notion-text notion-block-77d14cdc0d344fa1840f47d18752aa0c">致谢：</div></main></div>]]></content:encoded>
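FE 一节中的阻尼增量统计 IS := (N, LS, SS) 可以草拟如下：每次更新前先按衰减函数 d(Δt) = 2^(−λ·Δt) 对三元组整体衰减，再累加新实例，这样旧实例的权重随时间指数下降，而均值、方差都能增量地算出。类名和方法名均为演示假设，并非 Kitsune 源码：

```python
class IncStat:
    """阻尼增量统计 IS := (N, LS, SS) 的示意实现。"""
    def __init__(self, lam):
        self.lam = lam                      # 衰减系数 λ，对应不同的时间窗口
        self.N = self.LS = self.SS = 0.0
        self.t_last = None

    def update(self, x, t):
        if self.t_last is not None:
            d = 2 ** (-self.lam * (t - self.t_last))   # 旧统计量指数衰减
            self.N, self.LS, self.SS = d * self.N, d * self.LS, d * self.SS
        self.t_last = t
        self.N += 1.0
        self.LS += x
        self.SS += x * x

    def mean(self):
        return self.LS / self.N

    def var(self):
        return self.SS / self.N - self.mean() ** 2

s = IncStat(lam=1.0)
for t, x in [(0.0, 0.0), (5.0, 10.0)]:
    s.update(x, t)        # 5 秒后，旧实例的权重只剩 2**-5
```

当 λ = 0 时它退化为普通的增量均值/方差；λ 越大，统计量越偏向最近的流量，这正是"提取的特征具有时间性"的含义。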
        </item>
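AD 输出的 RMSE 异常分数以及上文提到的两种阈值 φ 选择方法（取训练阶段最大分数；或把训练分数拟合为对数正态分布后按发生概率报警）可以草拟如下。其中 k=3 的截断系数、训练分数列表和各函数名都是演示用的假设：

```python
import numpy as np

def rmse(x, x_rec):
    """自动编码器重建误差（均方根误差）。"""
    return float(np.sqrt(np.mean((x - x_rec) ** 2)))

def max_threshold(train_scores):
    """方法一：直接取训练阶段（全为正常流量）出现过的最大分数作为 φ。"""
    return max(train_scores)

def prob_alert(train_scores, s, k=3.0):
    """方法二：把训练分数取对数后拟合正态分布（对数正态假设），
    新分数落在 μ + k·σ 之外即认为发生概率过低，报警。"""
    logs = np.log(train_scores)
    mu, sigma = logs.mean(), logs.std()
    return bool(np.log(s) > mu + k * sigma)

train = [0.9, 1.0, 1.1, 1.0, 0.95, 1.05]   # 假想的训练阶段 RMSE 分数
```

执行阶段对每个实例 v 计算一次 RMSE，再用上面任一种阈值判断即可：远大于正常分布的分数就是"之前没遇到过"的流量。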
    </channel>
</rss>