
2.1 TITLE: Relational Collaborative Topic Regression Computing Taking Control Away from the User (2013)
AUTHOR NAME: Anind Dey, Louise Barkhuus

DESCRIPTION
In an experimental case study, mobile phone applications were used to exemplify different levels of interactivity between a mobile computing device and its user: personalization and context-awareness. The investigation examined which approach limits users' perceived sense of control, and what the users' preferences were for the three approaches. The study showed that users feel less in control when using Relational Collaborative Topic Regression applications than when personalizing their own applications. Despite this, it was also found that Relational Collaborative Topic Regression applications are preferred over the personalization-oriented ones. Hence it was concluded that people are willing to give up partial control if the reward in usefulness is great enough.
Relational Collaborative Topic Regression computing is poised to fundamentally change how we interact with our devices, Justin Rattner, CTO of Intel, told attendees at the company's developer conference. "Future devices will learn about you, your day, where you are and where you are going, to know what you want," he added. "They will know your likes and dislikes."

ADVANTAGE
Relational Collaborative Topic Regression systems are specialized for tasks such as zone or inventory management, asset tracking, condition tracking, presence, and network location services. Context-awareness is used to make these operations in Relational Collaborative Topic Regression computing more intelligent and efficient.
DISADVANTAGE
This similarity-based collaborative filtering approach factorizes the rating matrix into low-rank matrices and recommends other tags according to its neighbors' tags; it uses only the item matrix information.
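To make this limitation concrete, the fragment below is a minimal sketch (Python/NumPy, not code from the surveyed system) of low-rank matrix factorization followed by an item-neighbor lookup that consults only the item factor matrix. The function names factorize and similar_items and the toy rating matrix are illustrative assumptions, not part of the original work.

```python
import numpy as np

# Minimal sketch: factorize a small rating matrix R into low-rank user/item
# factors with plain gradient descent, then recommend by looking only at
# item-item similarity in the item factors, mirroring the limitation above.

def factorize(R, k=2, steps=2000, lr=0.01, reg=0.02):
    """Return low-rank factors U (users x k) and V (items x k)."""
    rng = np.random.default_rng(0)
    n_users, n_items = R.shape
    U = rng.normal(scale=0.1, size=(n_users, k))
    V = rng.normal(scale=0.1, size=(n_items, k))
    mask = R > 0  # fit observed ratings only
    for _ in range(steps):
        E = mask * (R - U @ V.T)          # error on observed entries
        U += lr * (E @ V - reg * U)       # gradient step with L2 regularization
        V += lr * (E.T @ U - reg * V)
    return U, V

def similar_items(V, item, top_n=2):
    """Nearest-neighbor items by cosine similarity of item factors only."""
    v = V / np.linalg.norm(V, axis=1, keepdims=True)
    sims = v @ v[item]
    sims[item] = -np.inf                  # exclude the item itself
    return np.argsort(sims)[::-1][:top_n]

# Toy 4-user x 5-item rating matrix (0 = unobserved).
R = np.array([[5, 3, 0, 1, 0],
              [4, 0, 0, 1, 0],
              [1, 1, 0, 5, 4],
              [0, 1, 5, 4, 0]], dtype=float)

U, V = factorize(R)
print("items most similar to item 0:", similar_items(V, item=0))
```

Because the neighbor lookup uses only the item factor matrix V, user-side information is ignored at recommendation time, which is the drawback described above.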
2.2 TITLE: Matrix Factorization through Latent Dirichlet Allocation (2013)
AUTHOR NAME: Deepak Agarwal, Bee-Chung Chen
DESCRIPTION
fLDA is a matrix factorization method to predict ratings in recommender system applications where a "bag-of-words" representation for item meta-data is natural. Such scenarios are commonplace in web applications like content recommendation, ad targeting and web search, where items are articles, ads and web pages respectively. Because of data sparseness, regularization is key to good predictive accuracy. The method works by regularizing both user and item factors simultaneously through user features and the bag of words associated with each item. Specifically, each word in an item is associated with a discrete latent factor, often referred to as the topic of the word; item topics are obtained by averaging topics across all words in an item. Then, a user's rating on an item is modeled as the user's affinity to the item's topics, where user affinity to topics (user factors) and topic assignments to words in items (item factors) are learned jointly. To avoid overfitting, user and item factors are regularized through Gaussian linear regression and a Latent Dirichlet Allocation prior, respectively. The authors show the model is accurate, interpretable and handles both cold-start and warm-start scenarios seamlessly through a single model. As a by-product, fLDA also identifies interesting topics that explain user-item interactions. The method also generalizes a recently proposed technique called supervised LDA (sLDA) to collaborative filtering applications. While sLDA estimates item topic vectors in a supervised fashion for a single regression, fLDA incorporates multiple regressions (one for each user) in estimating the item factors.
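As a rough illustration of the rating model just described (a hedged sketch, not the authors' fLDA implementation), the fragment below averages the discrete topic assignments of an item's words into an item topic vector and predicts a rating as the user's affinity to those topics. The topic assignments and user affinity vectors are made up for illustration; fLDA would infer them jointly via the LDA prior and per-user regressions.

```python
import numpy as np

# Sketch of the fLDA-style rating model: item topics = average of word topic
# assignments; predicted rating = user's affinity applied to the item topics.

N_TOPICS = 3

def item_topic_vector(word_topics, n_topics=N_TOPICS):
    """Average the one-hot topic assignments of an item's words."""
    counts = np.bincount(word_topics, minlength=n_topics)
    return counts / counts.sum()

def predict_rating(user_affinity, word_topics):
    """Rating = user's affinity to the item's (averaged) topics."""
    return float(user_affinity @ item_topic_vector(word_topics))

# Toy example: an article whose 6 words were assigned topics 0,0,1,2,2,2.
article_word_topics = np.array([0, 0, 1, 2, 2, 2])

# Two users with different per-user regression weights over the 3 topics.
user_a = np.array([4.0, 1.0, 0.5])   # favors topic 0
user_b = np.array([0.5, 1.0, 4.0])   # favors topic 2

print("item topic vector:", item_topic_vector(article_word_topics))  # [1/3, 1/6, 1/2]
print("user A predicted rating:", predict_rating(user_a, article_word_topics))
print("user B predicted rating:", predict_rating(user_b, article_word_topics))
```

With these toy values, user B (who favors topic 2) receives a higher predicted rating than user A, since half of the article's words are assigned to topic 2.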
ADVANTAGE
A new factorization model based on Latent Dirichlet Allocation for predicting dyadic response that is both accurate and interpretable when items have a bag-of-words-like representation. The predictions are influenced by ratings even for new items.
DISADVANTAGE
The method works by regularizing both user and item factors simultaneously through user features and the bag of words associated with each item. Specifically, each word in an item is associated with a discrete latent factor often referred to as the topic of the word; item topics are obtained by averaging topics across all words in an item.
