# Rafiey, Yoshida 2020 - Fast and Private Submodular and k-Submodular Functions Maximization with Matroid Constraints

## One-paragraph Summary

This paper sits at the intersection of `submodular / k-submodular maximization` and `differential privacy`. The submodular part brings the continuous-greedy line into the private regime and achieves `(1 - 1 / e)` under a matroid constraint, with an additive error depending on the sensitivity and the privacy budget. The `k`-submodular part is the newer contribution for this repo: the authors give the first private algorithm for monotone `k`-submodular maximization under a matroid, reaching the asymptotically tight `1 / 2` mark, together with a sampled variant that reduces the query complexity to nearly linear. So this paper is not just a privacy application; it is a milestone showing that the greedy / continuous frameworks for `k`-submodular survive in the private regime as long as only matroid feasibility and low sensitivity are required.

## Classification Rationale

I file this paper under `k-submodular` rather than `submodular` because the repo's new retrieval contribution lies in the `k`-submodular matroid part under privacy; the submodular part here plays a companion role and serves as a technical point of comparison.

## Setup

- The input is parameterized by a sensitive dataset `D`, which induces the objective function `F_D`.
- Two settings:
  - monotone submodular `F_D : 2^E -> R_+`;
  - monotone `k`-submodular `F_D : (k + 1)^E -> R_+`.
- Constraint: the support must lie in a matroid `M = (E, I)`.
- Key assumption: the objective has sensitivity `Delta` with respect to neighboring datasets.

## Main Results

1. There is a private algorithm for monotone submodular matroid maximization achieving `(1 - 1 / e) OPT` minus an additive error depending on `Delta`, `|E|`, `r(M)`, and `eps`.
2. An improved continuous-greedy variant achieves the same ratio but needs only `O(r(M) |E|^2 log |E| / eps)` direction evaluations, better than sweeping the entire cover of the matroid polytope.
3. The private greedy for monotone `k`-submodular matroid maximization achieves `1 / 2 OPT` minus an additive error of `O(Delta r(M) log |E| / eps)`.
4. A sampled version of the private `k`-submodular greedy keeps the same ratio and privacy guarantee but needs a number of queries nearly linear in `k |E|`.

## Core Algorithmic Idea

- Submodular part:
  - discretize the matroid polytope with a `rho`-cover;
  - run continuous greedy, but at each step pick the direction via the exponential mechanism with score `⟨y, grad f(x_t)⟩`;
  - finally round the fractional solution with swap rounding.
- `k`-submodular part:
  - privatize the classical greedy matroid algorithm;
  - at each iteration pick a pair `(e, i)` with probability proportional to `exp(eps' * Delta_{e,i} F_D(x))`;
  - the sampled variant inspects only a random subset of elements to reduce queries.

## Proof / Analysis Strategy

- Submodular: replay the original continuous greedy of Calinescu et al., adding discretization error and privacy noise; privacy follows from adaptive composition of the exponential mechanism.
- `k`-submodular: the spine of the approximation argument stays close to the non-private greedy proof, but replaces "pick the best marginal" with "privately pick a near-best one"; the additive error accumulates over `r(M)` steps.

## Key Techniques

- differential privacy via the exponential mechanism
- adaptive composition
- private continuous greedy
- discretized cover of the matroid polytope
- sampled candidate reduction
- privatized greedy over element-label pairs

## Delicate Points / Caveats

- The submodular part is only clean in the `low-sensitivity regime`; the additive error can be large if `Delta` is large.
- The privacy factor for the continuous-greedy submodular algorithm is `O(eps r(M)^2)`, worse than that of the discrete greedy.
- The `k`-submodular part handles only monotone objectives, not non-monotone ones.

## Extraction To Concepts

- No need yet for a dedicated concept note on `private k-submodular maximization`, but `concepts/k-submodularity.md` should record that `k`-submodular now has a privacy line under matroid constraints.

## Extraction To Syntheses

- `syntheses/k-submodular.md` should add this paper to the core list since it opens a privacy / sensitive-data branch.

## Weaknesses / Limits

- Does not handle non-monotone objectives.
- No results for knapsack, streaming, or fairness.
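As a rough illustration of the privatized greedy over element-label pairs sketched in the Core Algorithmic Idea section, here is a minimal Python sketch. It is not the paper's implementation: all names (`F`, `independent`, `rank`, `sensitivity`) are placeholders, the naive `eps / rank` budget split stands in for the paper's adaptive-composition accounting, and the `2 * sensitivity` scaling is the textbook exponential-mechanism normalization, not necessarily the paper's constants.

```python
import math
import random

def exponential_mechanism(candidates, score, eps_step, sensitivity):
    """Pick one candidate with probability proportional to
    exp(eps_step * score / (2 * sensitivity))."""
    scores = [score(c) for c in candidates]
    m = max(scores)  # subtract the max for numerical stability
    weights = [math.exp(eps_step * (s - m) / (2.0 * sensitivity)) for s in scores]
    return random.choices(candidates, weights=weights, k=1)[0]

def private_k_submodular_greedy(E, k, F, independent, rank, eps, sensitivity):
    """Hedged sketch of the privatized greedy for monotone k-submodular
    maximization under a matroid. A partial solution is a dict mapping
    element -> label in {1..k}; its support must stay independent in M."""
    x = {}
    eps_step = eps / rank  # naive per-step budget; the paper uses composition
    for _ in range(rank):
        support = set(x)
        # feasible (element, label) pairs: adding e keeps the support independent
        candidates = [(e, i)
                      for e in E
                      if e not in support and independent(support | {e})
                      for i in range(1, k + 1)]
        if not candidates:
            break

        def marginal(pair):
            e, i = pair
            y = dict(x)
            y[e] = i
            return F(y) - F(x)  # marginal gain Delta_{e,i} F(x)

        e, i = exponential_mechanism(candidates, marginal, eps_step, sensitivity)
        x[e] = i
    return x
```

The sampled variant from the paper would replace `candidates` with a random subset of the remaining elements before scoring, trading a small loss in the guarantee for near-linear query complexity.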