Papers by MohammadAmin Fazli
FNR (Fake News Revealer): A Similarity and Transformer-Based Approach to Detect Multi-Modal Fake News in Social Media.
Multi-view approach to suggest moderation actions in community question answering sites
Information Sciences
ArXiv, 2021
Portfolio optimization is one of the essential fields of focus in finance. In recent years there has been an increasing demand for novel computational methods in this area that compute portfolios with better returns and lower risks. We present a novel computational method called Representation Portfolio Selection (RPS), which redefines the distance matrix of financial assets using representation learning and clustering algorithms to increase diversification in portfolio selection. RPS proposes a heuristic for getting closer to the optimal subset of assets. Using empirical results, we demonstrate that widely used portfolio optimization algorithms, such as MVO, CLA, and HRP, can benefit from our asset subset selection.
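A minimal sketch of the RPS idea as described above, with assumed details (the correlation-based distance, the cluster count, and the medoid-style representative choice are illustrative choices, not the paper's exact procedure): build a distance matrix from asset return correlations, cluster it, and keep one representative asset per cluster as the diversified candidate subset.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(0)
returns = rng.normal(0, 0.01, size=(250, 8))   # 250 days of returns, 8 assets
corr = np.corrcoef(returns, rowvar=False)
dist = np.sqrt(0.5 * (1.0 - corr))             # correlation distance matrix
np.fill_diagonal(dist, 0.0)

# Hierarchical clustering over the condensed distance matrix.
Z = linkage(squareform(dist, checks=False), method="average")
labels = fcluster(Z, t=3, criterion="maxclust")  # at most 3 clusters (a choice)

# Pick the asset closest to its cluster's center (smallest mean intra-cluster distance).
subset = []
for c in np.unique(labels):
    members = np.flatnonzero(labels == c)
    rep = members[np.argmin(dist[np.ix_(members, members)].mean(axis=1))]
    subset.append(int(rep))
print(sorted(subset))
```

The selected subset can then be handed to a standard optimizer such as MVO or HRP in place of the full asset universe.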

ArXiv, 2019
Today's most prominent IT companies are built on the extraction of insight from data, and data processing has become crucial in data-intensive businesses. Nevertheless, the amount of data to be processed is growing significantly fast, and this pace of growth has changed the nature of data processing. Today, data-intensive industries demand highly scalable and fault-tolerant data processing architectures that can handle massive amounts of data. In this paper, we present a distributed architecture for elastic and resilient data processing based on Liquid, a nearline and offline big data architecture. We used the Reactive Manifesto to design an architecture that is highly reactive to workload changes and failures. We evaluate our architecture by drawing numerical comparisons between our architecture prototype and the Liquid prototype. The evaluation shows that our architecture can be more scalable against workload and more resilient against fa...
ArXiv, 2021
Non-Fungible Tokens (NFTs) have gained a solid foothold within the crypto community, and substantial amounts of money have been allocated to their trades. In this paper, we study Foundation, one of the most prominent marketplaces dedicated to NFT auctions and trades. We analyze the activities on Foundation and identify several intriguing underlying dynamics that occur on this platform. Moreover, we perform social network analysis on a graph we created based on transferred NFTs on Foundation, and describe the characteristics of this graph. Lastly, we build a neural network-based similarity model for retrieving and clustering similar NFTs. We also show that for most NFTs, their performance in auctions is comparable with the auction performance of other NFTs in their cluster.

We propose a novel hybrid learning approach to gain situation awareness in smart environments by introducing a new situation identifier that combines an expert system and a machine learning approach. Traditionally, expert systems and machine learning approaches have been widely used independently to detect ongoing situations as the main functionality in smart environments in various domains. Expert systems lack the functionality to adapt the system to each user and are expensive to design for each setting. Machine learning approaches, on the other hand, struggle with the cold-start problem and with making explainable decisions. Using both approaches enables the system to use users' feedback and capture environmental changes while exploiting the initial expert knowledge to solve the mentioned challenges. We use decision trees and situation templates as the core structure to interpret sensor data. To evaluate the proposed method, we generate a new human-annotated dataset simula...

This paper investigates how a central authority (e.g. a government) can increase social welfare in a network of markets and firms. In these networks, modeled using a bipartite graph, firms compete with each other à la Cournot. Each firm can supply homogeneous goods in the markets it has access to. The central authority may take different policies toward this aim; in this paper, we assume that the government has a budget with which it can supply goods and inject them into various markets. We discuss how the central authority can best allocate its budget for the distribution of goods to maximize social welfare. We show that the solution is highly dependent on the structure of the network. Then, using the network's structural features, we present a heuristic algorithm for our target problem. Finally, we compare the performance of our algorithm with other heuristics through experiments on real datasets.

Social norms play an important role in regulating the behavior of societies. They are behavioral standards that are considered acceptable in a group or society, and violating them results in sanctions for the violator. Both governments and various cultural communities use this social component to solve problems in society; the use of norms leads to a large reduction in community spending to control harmful behaviors. Social norms have two important aspects: promulgation and sanctioning. They are promulgated by activists in the community and, after creation, are endorsed with a sanction. Norms can be used to promote a variety of different behaviors, and online social networks have established a new and influential platform for promulgating them. We first redefine the Rescorla-Wagner conditional learning model in the context of social norms with the help of a norm's intrinsic properties, and extract the main coefficients in the Rescorla-Wagner model related to it. Based on...

2020 6th International Conference on Web Research (ICWR), 2020
Modern question-answering websites, such as StackOverflow and Quora, have specific user rules to maintain their content quality. These systems rely on user reports for assessing new content, which has serious problems, including slow handling of violations, the loss of normal and experienced users' time, the low quality of some reports, and discouraging feedback to new users. Therefore, with the overall goal of providing solutions for automating moderation actions on Q&A websites, we aim to provide a model that predicts 20 quality or subjective aspects of questions on QA websites. To this end, we used data gathered by the CrowdSource team at Google Research in 2019 and fine-tuned a pre-trained BERT model on our problem. The model achieves 95.4% accuracy after 2 epochs of training and did not improve substantially in later ones. The results confirm that with simple fine-tuning we can obtain accurate models in little time and with a small amount of data.

Heliyon, 2019
On-demand resource provisioning and elasticity are two of the main characteristics of the cloud computing paradigm. As a result, the load on a cloud service provider (CSP) is not fixed, and almost always a number of its physical resources are unused; these are called spare resources. Since CSPs typically do not want to be underprovisioned at any time, they procure physical resources in accordance with a pessimistic forecast of their loads, and this leads to a large amount of spare resources most of the time. Some CSPs rent out their spare resources at a lower price, called the spot price, which varies over time with respect to the market or the internal state of the CSP. In this paper, we assume the spot price to be a function of the CSP's load. We introduce the concept of a parasite CSP, which rents spare resources from several CSPs simultaneously at spot prices and rents them to its customers at an on-demand price lower than the host CSPs' on-demand prices. We propose the overall architecture and interaction model of the parasite CSP, and give a mathematical analysis of the amount of spare resources of the host CSPs, the amount of resources the parasite CSP can rent (its virtual capacity), and the probability of SLA violations. We evaluate our analysis on pricing data gathered from Amazon EC2 services. The results show that if the parasite CSP relies on several host CSPs, its virtual capacity can be considerable and the expected penalty due to SLA violations is acceptably low.
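A toy numerical sketch of the spare-capacity intuition above (not the paper's exact model; the load distribution, capacity, and 99% availability target are illustrative assumptions): treat the host CSP's load as a random variable, derive the spare-capacity distribution, and estimate how much a parasite CSP could rent out while keeping the SLA-violation risk low.

```python
import random

random.seed(42)
CAPACITY = 1000                                        # host CSP physical units
loads = [random.gauss(600, 80) for _ in range(10000)]  # simulated load samples
spare = [max(CAPACITY - load, 0.0) for load in loads]  # spare resources per sample

# Virtual capacity: the spare amount available with probability >= 99%
# (i.e. the 1st percentile of the spare distribution).
spare.sort()
virtual_capacity = spare[int(0.01 * len(spare))]

# Empirical probability that the host's spare dips below what the
# parasite rented out (an SLA-violation proxy).
violations = sum(1 for s in spare if s < virtual_capacity) / len(spare)
print(round(virtual_capacity, 1), round(violations, 4))
```

With several independent host CSPs, the same calculation over the aggregated spare distribution yields a larger usable virtual capacity for the same risk level, which matches the paper's qualitative conclusion.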

IEEE Transactions on Network Science and Engineering, 2019
Controlling networked dynamical systems is a challenging endeavor, specifically keeping in mind the fact that in many scenarios the actors engaged in the dynamism behave selfishly, only taking into account their own individual utility. This setting has been widely studied in the field of game theory. One way we can control system dynamics is through the use of control parameters that are at our disposal, but finding optimal values for these parameters is complex and time consuming. In this paper we use the relation between network structural properties and control parameters to create a mathematical model that speeds up the calculation of the aforementioned values. For this, we use learning methods to find optimal values that can control the system dynamics based on the correlation between structurally similar networks.

Future Generation Computer Systems, 2019
One of the main questions in cloud computing environments is how to efficiently distribute user requests or virtual machines (VMs), based on their resource needs over time. This question is also important in a cloud federation environment, where rational cloud service providers collaborate by sharing customer requests. By considering intrinsic aspects of the cloud computing model, one can propose request distribution methods that play to the strengths of this computing paradigm. In this paper we look at statistical multiplexing and server consolidation as such strengths and examine the use of the coefficient of variation and other related statistical metrics as objective functions for deciding on the request distribution mechanism. The complexity of using these objective functions is analyzed, and heuristic methods that enable efficient request partitioning in feasible time are presented and compared.
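A hedged illustration of the statistical-multiplexing intuition behind the coefficient-of-variation objective (the workload model below is an assumption, not the paper's): the CV (standard deviation over mean) of an aggregated load is lower than that of its parts, which is why CV is a sensible objective when partitioning VM requests.

```python
import random
import statistics

random.seed(1)

def cv(xs):
    """Coefficient of variation: population std dev divided by the mean."""
    return statistics.pstdev(xs) / statistics.mean(xs)

# Two independent bursty request streams over 1000 time steps
# (exponential inter-load model, mean 50 units each -- an assumption).
a = [random.expovariate(1 / 50) for _ in range(1000)]
b = [random.expovariate(1 / 50) for _ in range(1000)]
merged = [x + y for x, y in zip(a, b)]  # both streams on one provider

print(round(cv(a), 2), round(cv(b), 2), round(cv(merged), 2))
```

The merged stream's CV is noticeably smaller than either input's, so a partitioning heuristic that minimizes per-provider CV tends to co-locate streams whose fluctuations cancel out.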

IEEE Transactions on Computational Social Systems, 2018
Social norms are a core concept in social sciences and play a critical role in regulating a society's behavior. Organizations and even governmental bodies use this social component to tackle varying challenges in society, as it is a less costly alternative to establishing new laws and regulations. Social networks are an important and effective infrastructure in which social norms can evolve. Therefore, there is a need for theoretical models for studying the spread of social norms in social networks. In this paper, by using the intrinsic properties of norms, we redefine and tune the Rescorla-Wagner conditioning model in order to obtain an effective model for the spread of social norms. We extend this model to a network of people as a Markov chain, and study the potential structures of steady states in this process. We then formulate the problem of maximizing the adoption of social norms in a social network by finding the best set of initial norm adopters. Finally, we propose a polynomial-time algorithm for solving this problem and evaluate it on different networks. Our experiments show that our algorithm has superior performance over other methods.
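A minimal sketch of the classic Rescorla-Wagner update that the paper adapts (parameter values here are illustrative, and the network extension is omitted): the association strength V moves toward the maximum conditioning level lam at a rate set by the salience and learning-rate parameters alpha and beta.

```python
def rescorla_wagner(v0, alpha, beta, lam, steps):
    """Iterate the Rescorla-Wagner delta rule: V <- V + alpha*beta*(lam - V)."""
    v = v0
    history = [v]
    for _ in range(steps):
        v = v + alpha * beta * (lam - v)  # prediction-error-driven update
        history.append(v)
    return history

# Starting from no association, V converges toward lam = 1.0.
hist = rescorla_wagner(v0=0.0, alpha=0.5, beta=0.4, lam=1.0, steps=30)
print(round(hist[-1], 4))
```

In the paper's setting, lam and the rate parameters would be derived from a norm's intrinsic properties and a node's neighborhood rather than fixed constants.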
Operations Research Letters, 2015
In this paper, we consider the following problem: given an undirected graph G = (V, E) and an integer k, find a set I of vertex pairs (non-edges) with |I| ≤ k such that G′ = (V, E ∪ I) has the maximum number of triangles (cycles of length 3). We first prove that this problem is NP-hard and then give an approximation algorithm for it.
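A brute-force baseline for the problem statement above (the paper gives an approximation algorithm; exhaustive search like this is only feasible for tiny graphs): try every set of k non-edges and keep the one that maximizes the resulting triangle count.

```python
from itertools import combinations

def triangles(n, edges):
    """Count triangles in an undirected graph given as a set of (u, v) pairs."""
    adj = [set() for _ in range(n)]
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    return sum(1 for a, b, c in combinations(range(n), 3)
               if b in adj[a] and c in adj[a] and c in adj[b])

def best_augmentation(n, edges, k):
    """Exhaustively find k non-edges whose addition maximizes the triangle count."""
    non_edges = [e for e in combinations(range(n), 2)
                 if e not in edges and (e[1], e[0]) not in edges]
    return max((triangles(n, edges | set(extra)), extra)
               for extra in combinations(non_edges, k))

# Path 0-1-2-3: one added edge can create at most one triangle.
count, added = best_augmentation(4, {(0, 1), (1, 2), (2, 3)}, k=1)
print(count, added)
```

On the 4-node path, adding the chord (0, 2) or (1, 3) closes exactly one triangle, while (0, 3) only closes a 4-cycle, so the optimum here is one triangle.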
2012 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining, 2012
Modeling is one of the major research areas in social network analysis, whose goal is to study network structure and its evolution. Motivated by the intuition that members of social networks behave selfishly, network creation games have been introduced for modeling social networks. In this paper, our aim is to measure how compatible the output graphs of a given network creation game are with a social network. We first show that the precise measurement is not possible in polynomial time. We then propose a method for its approximation; finally, we show the usability of our method by conducting experiments on real network data.

Journal of Combinatorial Optimization, 2015
We consider a class of optimization problems called movement minimization on the Euclidean plane. Given a set of nodes on the plane, the aim is to achieve some specific property by minimum movement of the nodes. We consider two specific properties, namely connectivity (Con) and realization of a given topology (Topol). By minimum movement, we mean either the sum of all movements (Sum) or the maximum movement (Max). We obtain several approximation algorithms and some hardness results for these four problems: an O(m)-factor approximation for ConMax and ConSum, and an O(m/OPT)-factor approximation for ConMax. We also extend some known results on graphical grounds in [?, ?] and obtain inapproximability results on geometrical grounds. For the Topol problem (where the final positions of the nodes must correspond to a given configuration), we find the setting much simpler and provide an FPTAS for both the Max and Sum versions.
Proceedings of the 23rd ACM symposium on Parallelism in algorithms and architectures - SPAA '11, 2011
We consider a network creation game in which each player (vertex) has a fixed budget to establish links to other players. In our model, each link has unit price and each agent tries to minimize its cost, which is either its local diameter or its total distance to other players in the (undirected) underlying graph of the created network. Two versions of the game are studied: in the MAX...

Lecture Notes in Computer Science, 2012
The spread of influence in social networks is studied in two main categories: the progressive model and the non-progressive model (see e.g. the seminal work of Kempe, Kleinberg, and Tardos in KDD 2003). While progressive models are suitable for modeling the spread of influence in monopolistic settings, non-progressive models are more appropriate for non-monopolistic settings, e.g., modeling the diffusion of two competing technologies over a social network. Despite the extensive work on the progressive model, non-progressive models have not been studied as well. In this paper, we study the spread of influence in the non-progressive model under the strict majority threshold: given a graph G with a set of initially infected nodes, each node gets infected at time τ iff a majority of its neighbors were infected at time τ − 1. Our goal in the MinPTS problem is to find a minimum-cardinality initial set of infected nodes that would eventually converge to a steady state where all nodes of G are infected. We prove that while MinPTS is NP-hard for a restricted family of graphs, it admits an improved constant-factor approximation algorithm for power-law graphs. We do so by proving lower and upper bounds in terms of the minimum and maximum degrees of nodes in the graph; the upper bound is achieved by applying a natural greedy algorithm. Our experimental evaluation of the greedy algorithm also shows its superior performance compared to other algorithms on a set of real-world graphs as well as random power-law graphs. Finally, we study the convergence properties of these algorithms and show that the non-progressive model converges in at most O(|E(G)|) steps.
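A small simulation of the non-progressive strict-majority dynamics described above (the example graph and seed set are illustrative): node v is infected at step t iff a strict majority of its neighbors were infected at step t − 1, so nodes can also become uninfected, unlike in the progressive model.

```python
def step(adj, infected):
    """One synchronous update of the strict-majority rule."""
    return {v for v in adj
            if sum(1 for u in adj[v] if u in infected) * 2 > len(adj[v])}

def run(adj, seed, max_steps=100):
    """Iterate until the infected set repeats (fixed point or cycle)."""
    infected, seen = set(seed), []
    for _ in range(max_steps):
        if infected in seen:
            break
        seen.append(set(infected))
        infected = step(adj, infected)
    return infected

# Complete graph K5: seeding 4 of 5 nodes gives every node a strict
# majority of infected neighbors, so the infection spreads to all nodes.
adj = {v: [u for u in range(5) if u != v] for v in range(5)}
final = run(adj, seed={0, 1, 2, 3})
print(sorted(final))
```

Note that a smaller seed can die out entirely under this rule (seed nodes themselves revert when their own neighborhoods lack a majority), which is what makes finding a minimum perfect target set non-trivial.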
Lecture Notes in Computer Science, 2011
We study a classical problem in communication and wireless networks called Finding White Space Regions. In this problem, we are given a set of antennas (points) some of which are noisy (black) and the rest are working fine (white). The goal is to find a set of convex hulls with maximum total area that cover all white points and exclude all black points. In other words, these convex hulls make it safe for white antennas to communicate with each other without any interference with black antennas. We study the problem on three different settings (based on overlapping between different convex hulls) and find hardness results and good approximation algorithms.

Journal of Combinatorial Optimization, 2014
Assume that we are given a set of points, some of which are black and the rest white. The goal is to find a set of convex polygons with maximum total area that cover all white points and exclude all black points. We study the problem in three different settings (based on overlapping between different convex polygons): (1) when convex polygons are permitted to have common area, we present a polynomial algorithm; (2) when convex polygons are not allowed to have common area but may share vertices, we prove the NP-hardness of the problem and propose an algorithm whose output is at least (1/4) · OPT/(log(2n/OPT) + 2 log n); (3) finally, when convex polygons may share neither area nor vertices, we again prove NP-hardness and propose an algorithm whose output is at least (3√3)/(4π) · OPT/(log(2n/OPT) + 2 log n).