As entire sectors of the economy are digitized, familiar procedures are either completely redesigned or at least optimized here and there. What may sound harmless to the average citizen can have serious consequences for one's own financial life. Take the world of finance, for example, where lending is probably the area most influenced by digitization – keyword: algorithms.
Is the credit rating the only decisive factor in granting a loan? Not at all!
At a time when so-called FinTech companies in particular are luring customers away from the traditional credit market with promises of "fast loans" and "money lent at any time of day or night", phenomena arise that ordinary consumers can hardly explain from past experience. There is a solid income, the credit history is not really bad, and the classic requirements for a successful loan application all seem to be met. And yet it happens again and again that, despite apparently good conditions, an online loan requested from a FinTech company via the Internet is rejected. The question "why?" is legitimate, but largely remains unanswered by the credit providers. So what is behind such rejections, especially with the online loans offered by modern Internet banks without a branch network? The reason lies in the use of so-called algorithms: complex, and therefore complicated, mathematical formulas that decide within seconds whether a customer is creditworthy or not.
Data sets create standards thanks to algorithms – to the detriment of some customers
Anyone who has access to large amounts of data, or can generate them with appropriate software, is also able to evaluate these data volumes according to certain criteria and, depending on the purpose, to draw supposedly logical conclusions from them. Conclusions that can then serve as a basis for decisions – for example, when granting loans.
Examples? If the loan default rate in a particular city district has been especially high over the past 12 months, borrowers from that district often face rejection, or at least significantly worse borrowing terms, despite otherwise meeting the positive lending criteria.
Another example: if an analysis reveals that credit defaults are particularly associated with certain first names, this too can negatively affect a credit decision.
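To make the mechanism behind these two examples concrete, here is a toy scoring rule as a minimal sketch. All names, districts, rates, and thresholds are invented purely for illustration; real credit algorithms are far more complex, and their actual criteria are not public.

```python
# Hypothetical illustration of how historical group statistics can
# leak into an individual credit decision. All data and thresholds
# below are invented for this sketch.

# Past 12-month default rates aggregated by city district (toy data)
DISTRICT_DEFAULT_RATE = {"Nordstadt": 0.03, "Weststadt": 0.18}

# First names that co-occurred with past defaults (toy data)
FLAGGED_FIRST_NAMES = {"Kevin"}

def score_application(income, clean_history, district, first_name):
    """Return True (approve) or False (reject).

    Even with a solid income and a clean credit history, the
    group-based penalties below can push an applicant under the
    approval threshold.
    """
    score = 0.0
    score += 0.5 if income >= 2500 else 0.1   # individual criterion
    score += 0.3 if clean_history else 0.0    # individual criterion
    # Group-based penalties derived from historical correlations:
    score -= DISTRICT_DEFAULT_RATE.get(district, 0.0) * 2
    score -= 0.2 if first_name in FLAGGED_FIRST_NAMES else 0.0
    return score >= 0.6

# Identical personal finances, different district and first name:
print(score_application(3000, True, "Nordstadt", "Anna"))   # approved
print(score_application(3000, True, "Weststadt", "Kevin"))  # rejected
```

The point of the sketch: the two applicants are financially identical, yet the second is rejected solely because of features that describe a group rather than the individual.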
The problem, however, is that while credit algorithms speed up the process from a technical point of view, they also run the risk of reproducing "prejudices" – and thus completely disregarding the objectivity and fair assessment of a credit application.