
Introduction

  • (1)

    We propose a new social recommendation model based on graph neural networks (GNNs). The model targets noisy information in the data and enriches the available information sources by mining the user-item interaction graph.

  • (2)

    We apply the idea of the fast Fourier transform (FFT) from the signal-processing field to recommendation, combining the FFT with a learnable filter to suppress noise in the original data and reduce its impact on recommendation accuracy.

  • (3)

    We design an adaptive residual graph convolution algorithm. During graph convolution, the similarity between the current embedding layer and the initial embedding layer is used to adaptively supplement the initial embedding information. This effectively delays the onset of over-smoothing, mines deeper connections between nodes, and provides higher-quality recommendations for target users.

  • (4)

    SocialGCNRI is applied to two real-world datasets and compared against different baseline methods. The experimental results demonstrate the superiority of the model on three metrics (Recall, Precision, and NDCG), and further ablation experiments validate the effectiveness of the model's components.

Proposed model

Overall framework of the SocialGCNRI

Embedding layer

E_u = [e_{u_1}, e_{u_2}, \ldots, e_{u_n}], \qquad E_i = [e_{i_1}, e_{i_2}, \ldots, e_{i_m}]

Filter layer

Fast Fourier transform

E_f = \mathcal{F}(E_u) \in \mathbb{C}^{n \times d}
\mathcal{F}(x) = C_k = \sum_{m=0}^{d-1} x_m \omega^{km}, \qquad 0 \le k \le d-1
\bar{E}_f = W \odot E_f

Inverse fast Fourier transform

\hat{E}_u = \mathcal{F}^{-1}(\bar{E}_f)
\mathcal{F}^{-1}(C) = x_m = \frac{1}{d} \sum_{n=0}^{d-1} C_n \omega^{-nm}, \qquad 0 \le m \le d-1
E_u^{(0)} = E_u + \mathrm{dropout}(\hat{E}_u)
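The filter layer above can be sketched in NumPy. This is a minimal illustration under stated assumptions, not the paper's implementation: `fft_filter_layer` and its arguments are hypothetical names, and the learnable complex filter `W` is passed in as a fixed array rather than trained.

```python
import numpy as np

def fft_filter_layer(E_u, W, drop_rate=0.1, rng=None):
    """Denoise an embedding matrix in the frequency domain.

    E_u: (n, d) real embedding matrix.
    W:   (d,) filter applied element-wise in the frequency domain
         (fixed here for illustration; learnable in the model).
    """
    rng = rng or np.random.default_rng(0)
    # FFT along the embedding dimension: E_f = F(E_u)
    E_f = np.fft.fft(E_u, axis=-1)
    # Element-wise filter: E_bar = W * E_f
    E_bar = W * E_f
    # Inverse FFT back to the original domain: E_hat = F^{-1}(E_bar)
    E_hat = np.fft.ifft(E_bar, axis=-1).real
    # Residual connection with (inverted) dropout: E^(0) = E_u + dropout(E_hat)
    mask = (rng.random(E_hat.shape) > drop_rate) / (1.0 - drop_rate)
    return E_u + mask * E_hat
```

With an all-ones filter and dropout disabled, the FFT/IFFT pair is the identity, so the output reduces to the plain residual sum.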

Residual graph convolutional layer

User-item heterogeneous graph construction

A = \begin{pmatrix} S & R \\ R^{T} & S^{T} \end{pmatrix}
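A minimal sketch of assembling a heterogeneous adjacency matrix from the social matrix S and the interaction matrix R. The function name is hypothetical, the item-item block is set to zero purely for dimensional consistency in this illustration, and symmetric normalization \hat{A} = D^{-1/2} A D^{-1/2} is assumed as a common choice.

```python
import numpy as np

def build_hetero_adjacency(S, R):
    """Assemble and normalize a user-item heterogeneous adjacency matrix.

    S: (n, n) user-user social matrix.
    R: (n, m) user-item interaction matrix.
    Returns the symmetrically normalized (n+m, n+m) matrix A_hat.
    """
    n, m = R.shape
    # Block layout: social links among users, interactions between users and items
    A = np.block([[S, R],
                  [R.T, np.zeros((m, m))]])
    # Symmetric normalization D^{-1/2} A D^{-1/2}, guarding isolated nodes
    deg = A.sum(axis=1)
    d_inv_sqrt = np.where(deg > 0, deg, 1.0) ** -0.5
    return d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
```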

Information propagation between nodes

q_u^{(l)} = \alpha \hat{A} q_u^{(l-1)} + (1-\alpha) q_u^{(0)}
p_u^{(l)} = \alpha \hat{A}^{2} p_u^{(l-1)} + (1-\alpha) p_u^{(0)}
\alpha = \max\!\left(\mathrm{pearson}\!\left(E_u^{(l-1)}, E_u^{(0)}\right), 0\right)
q_u = \mathrm{sum}\!\left(q_u^{(0)}, \ldots, q_u^{(k-1)}\right)/k
p_u = \mathrm{sum}\!\left(p_u^{(0)}, \ldots, p_u^{(k-1)}\right)/k
\hat{E}_u = \frac{W_3\left(\tanh(w_1 q_u) \,\Vert\, \tanh(w_2 p_u)\right)}{\left\lVert W_3\left(\tanh(w_1 q_u) \,\Vert\, \tanh(w_2 p_u)\right)\right\rVert_2}
E_i^{(l)} = \alpha \hat{A} E_i^{(l-1)} + (1-\alpha) E_i^{(0)}
\hat{E}_i = \mathrm{sum}\!\left(E_i^{(0)}, \ldots, E_i^{(k-1)}\right)/k
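The adaptive residual propagation rule can be illustrated for one embedding stream as follows. This is a sketch under assumptions: the helper names are hypothetical, and the Pearson coefficient between the current and initial embeddings is computed on flattened matrices as one plausible reading of the per-layer similarity.

```python
import numpy as np

def pearson_alpha(E_prev, E_0):
    """alpha = max(pearson(E^(l-1), E^(0)), 0), on flattened embeddings."""
    r = np.corrcoef(E_prev.ravel(), E_0.ravel())[0, 1]
    return max(r, 0.0)

def adaptive_residual_propagation(A_hat, E_0, k):
    """Run k layers of E^(l) = alpha * A_hat @ E^(l-1) + (1 - alpha) * E^(0),
    recomputing alpha each layer, then average the k layer outputs."""
    layers = [E_0]
    E = E_0
    for _ in range(1, k):
        alpha = pearson_alpha(E, E_0)
        E = alpha * (A_hat @ E) + (1 - alpha) * E_0
        layers.append(E)
    return sum(layers) / k
```

The larger the correlation with the initial embedding, the more weight the graph-propagated signal receives; as layers drift away from E^(0), the residual term pulls them back, delaying over-smoothing.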

Prediction layer and model optimization

Prediction layer

\hat{y}_{ui} = \hat{e}_u^{T} \hat{e}_i

Model optimization

Loss_{bpr} = -\frac{1}{|N|} \sum_{(u,i,j) \in N} \ln \sigma\!\left(\hat{y}_{ui} - \hat{y}_{uj}\right) + \lambda \lVert \Phi \rVert_2^2
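The inner-product prediction and the BPR objective can be sketched together. The function names are illustrative, and the parameter set Φ is passed as a single flat array for simplicity.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def bpr_loss(e_u, e_i, e_j, params, lam=1e-4):
    """BPR loss over triples (u, i, j): item i observed, item j unobserved.

    e_u, e_i, e_j: (N, d) embedding rows for each triple.
    params: flat array of model parameters Phi (L2-regularized).
    """
    # Predicted scores: y_ui = e_u^T e_i (row-wise inner products)
    y_ui = np.sum(e_u * e_i, axis=1)
    y_uj = np.sum(e_u * e_j, axis=1)
    # Pairwise ranking term: -mean log sigmoid(y_ui - y_uj)
    loss = -np.mean(np.log(sigmoid(y_ui - y_uj)))
    # L2 regularization on the parameters
    return loss + lam * np.sum(params ** 2)
```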

Experiments

  • (1)

    The advantages of SocialGCNRI in terms of recommendation performance.

  • (2)

    SocialGCNRI effectively mitigates noisy data and cold start problems.

  • (3)

    Impact of the FFT algorithm and adaptive residual graph convolutional algorithm on recommendation performance.

  • (4)

    Impact of the number of layers of adaptive residual graph convolutional algorithm on recommendation performance.

Datasets

Table 1:

Statistics of the LastFM and Ciao datasets.
Dataset  Users  Items    User-item interactions  Interaction density  Social connections  Connection density
LastFM   1,892  17,632   92,834                  0.028%               25,434              0.711%
Ciao     7,375  105,114  284,086                 0.037%               57,544              0.016%
DOI: 10.7717/peerj-cs.3010/table-1

Evaluation metrics

\mathrm{Precision}@k = \frac{TP@k}{TP@k + FP@k}
\mathrm{Recall}@k = \frac{|R(u)@k \cap T(u)@k|}{|T(u)@k|}
\mathrm{NDCG}@k = \frac{DCG@k}{IDCG@k}
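The three metrics can be computed per user as follows; binary relevance is assumed for NDCG, and the function names are illustrative.

```python
import numpy as np

def precision_recall_at_k(recommended, relevant, k):
    """Precision@k and Recall@k for one user.

    recommended: ranked list of item ids; relevant: set of ground-truth items.
    """
    top_k = recommended[:k]
    hits = len(set(top_k) & set(relevant))
    return hits / k, hits / len(relevant)

def ndcg_at_k(recommended, relevant, k):
    """NDCG@k = DCG@k / IDCG@k with binary relevance (gain 1 per hit)."""
    top_k = recommended[:k]
    dcg = sum(1.0 / np.log2(rank + 2)
              for rank, item in enumerate(top_k) if item in relevant)
    # Ideal DCG: all relevant items ranked first
    idcg = sum(1.0 / np.log2(rank + 2)
               for rank in range(min(len(relevant), k)))
    return dcg / idcg if idcg > 0 else 0.0
```

Dataset-level scores are then obtained by averaging these per-user values over all test users.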

Baseline models

Experiment setup

Overall performance

Ablation study and analyses

  • (1)

    The use of the FFT and a learnable filter to denoise the original data.

  • (2)

    The proposal of an adaptive residual graph convolutional network to capture higher-order connections between users and items in the graph.

Table 4:

SocialGCNRI ablation experimental results on the LastFM and Ciao datasets.
Dataset  Metric        SocialGCNRI  Variant-P  Variant-G  Variant-F
LastFM   Precision@10  0.2005       0.1457     0.1957     0.2002
         Precision@20  0.1391       0.1081     0.1387     0.1390
         Recall@10     0.2069       0.1500     0.2002     0.2045
         Recall@20     0.2837       0.2204     0.2381     0.2832
         NDCG@10       0.2643       0.1845     0.2546     0.2641
         NDCG@20       0.2891       0.2087     0.2848     0.2882
Ciao     Precision@10  0.0303       0.0170     0.0270     0.0299
         Precision@20  0.2252       0.0142     0.0207     0.2225
         Recall@10     0.0442       0.0209     0.0408     0.0440
         Recall@20     0.0651       0.0466     0.0620     0.0650
         NDCG@10       0.0478       0.0242     0.0432     0.0472
         NDCG@20       0.0521       0.0332     0.0488     0.0520
DOI: 10.7717/peerj-cs.3010/table-4

Note:

Values in bold represent optimal performance.

Impact of the number of convolutional layers

Impact of hyperparameters on recommendation performance

Conclusion

Appendix

Supplemental Information

SocialGCNRI code.

DOI: 10.7717/peerj-cs.3010/supp-1

Additional Information and Declarations

Competing Interests

Author Contributions

Data Availability

Funding

This work was supported by the Major Science and Technology Programs in Henan Province under Grants 241100210100, by the National Natural Science Foundation of China under Grants 62102372, 62072416, and 61902361, by Key Research and Development Program of Shaanxi (Program No. 2024GX-YBXM-545), by Key Research and Development Special Project of Henan Province under Grants 252102211070, 252102210139, 252102210127, 232102211051, 232102211053, 242102211068, 242102210107, 232102321069, 232102210078, HNKP2024214, and 2023SJGLX369Y, by the Natural Science Foundation Project of Henan Province under Grant 222300420582, by the Doctoral Fund Project of Zhengzhou University of Light Industry under Grants 2021BSJJ029, and 2020BSJJ030, by the Mass Innovation Space Incubation Project under Grant 2023ZCKJ216, by the Key Scientific Research Projects of Higher Education Institutions in Henan Province under Grant 24B520038, and by the innovation team of data science and knowledge engineering of Zhengzhou University of Light Industry. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

 
