
Facebook UK

11 Projects, page 1 of 3
  • Funder: UK Research and Innovation
    Project Code: EP/S013008/1
    Funder Contribution: 974,935 GBP

    While the traditional, deductive approach to logic begins with premisses and applies proof rules step by step to derive conclusions, the complementary reductive approach instead begins with a putative conclusion and searches for premisses sufficient for a legitimate derivation to exist, by systematically reducing the space of possible proofs. Not only does this picture more closely resemble the way in which mathematicians actually prove theorems and, more generally, the way in which people solve problems using formal representations, it also encapsulates diverse applications of logic in computer science, such as the programming paradigm known as logic programming, the proof-search problem at the heart of AI and automated theorem proving, precondition generation in program verification, and more. It is also reflected at the level of truth-functional semantics --- the perspective on logic used for model checking and thus for verifying the correctness of industrial systems --- wherein the truth value of a formula is calculated from the truth values of its constituent parts.

    Despite the reductive viewpoint reflecting logic as it is actually used, and in stark contrast to deductive logic, a uniform mathematical foundation for reductive logic does not exist. Substantial background is provided by the work of Pym, Ritter, and Wallen, but this is essentially restricted to classical and intuitionistic logic and, even then, lacks an explicit theory of the computational processes involved. We believe coalgebra --- a unifying mathematical framework for computation, state-based systems and decomposition, to which Silva is a leading contributor and exponent --- can be applied to this end. Deduction is essentially captured by inductive constructions, but reduction is captured through the coalgebraic technique of coinduction, which decomposes goals into subgoals. Existing work shows that coalgebra generalizes truth-functional semantics and can represent basic aspects of search spaces. We will extend this work to logics in full generality and, by utilizing the coalgebraic approach to the modelling of computation, also capture the control procedures required for proof-search. The algebraic properties of coalgebra should ensure that all aspects of this modelling, including the definitions of logics, their search spaces, and their search procedures, are compositional.

    Beyond this advance on the state of the art in semantic approaches to proof-search, we can hope to use coalgebraic presentations of computation to achieve much more. By interfacing coalgebraic models of proof-search with coalgebraic models of, for example, probabilistic computation or programming languages, we can hope to give a clean, generic and modular presentation of applications of the reductive logic viewpoint as diverse as inductive logic programming and abduction-based Separation Logic tools such as Facebook's Infer. Abstracting the key features of such systems into a modular semantic framework can help with more than simply understanding how existing tools work and how they can be improved: such a framework can also guide the design and implementation of new tools. Thus, in tandem with our theoretical development, we will develop efficient, semantically driven automated reasoning support with wide application, and so hope to implement tools capable of deployment across a large range of reasoning problems and to guide the design of theorem provers for specific logics.
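    As a rough illustration of the reductive viewpoint described above, the following sketch performs backward-chaining proof-search over propositional Horn clauses: a goal is reduced to the subgoals demanded by each applicable rule until only facts remain. The rule set and atom names are purely illustrative assumptions and are not drawn from the project itself.

```python
# A minimal sketch of reductive (backward-chaining) proof-search over
# propositional Horn clauses. All names here are hypothetical examples.

# Each atom maps to a list of premise lists; an empty premise list means
# the atom is a fact, and an atom with no entry is simply underivable.
rules = {
    "wet_ground": [["rain"], ["sprinkler_on"]],
    "sprinkler_on": [[]],          # a fact: provable with no premisses
}

def prove(goal, rules, seen=frozenset()):
    """Reduce `goal` to subgoals, searching for premisses that close the proof."""
    if goal in seen:               # guard against cyclic rule sets
        return False
    for premisses in rules.get(goal, []):
        if all(prove(p, rules, seen | {goal}) for p in premisses):
            return True
    return False

print(prove("wet_ground", rules))  # True, via the 'sprinkler_on' branch
print(prove("rain", rules))        # False: no rule concludes 'rain'
```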

  • Funder: UK Research and Innovation
    Project Code: MR/V024299/1
    Funder Contribution: 1,499,920 GBP

    When your PC's power is cut off, or your mobile phone runs out of battery, anything you were working on at the time (e.g. a game of Solitaire) is lost forever. This is because RAM, the memory your device uses to store data temporarily, is wiped clean whenever it loses power. However, documents you saved on your PC or the photos you took on your phone are still on your device when it switches back on, because they are saved to the hard drive, which is permanent storage that keeps its data even without power. Recently, technology companies have manufactured a new type of RAM, non-volatile memory (NVM), that does not lose its data when it loses power. This means that when a device switches back on after losing power or after something catastrophic like a crash, its data is still available on NVM; we say that the data persists. With careful engineering, we may therefore recover the data and not lose our work.

    However, this is not quite so straightforward. Modern devices are very fast, and to achieve this they use clever methods to get work done efficiently: they do multiple tasks at once, or carry out a list of tasks in a different order. Sometimes this means leaving your data in a bad state temporarily and fixing it later, e.g. deleting your old data before saving a new version. If we lose power during a bad temporary state (e.g. after deleting old data but before saving its replacement), then when we restart the device we may recover bad data. This has many unfortunate consequences, from simply losing data to causing errors in our software.

    My research project will solve these problems by studying NVM use from the perspective of hardware (e.g. our phones), software (e.g. our phone apps), and theoretical analysis. I will develop new tools and techniques that will help us build persistent technology, and then use formal (mathematical) methods to prove that these tools and techniques are safe and correct. My proposed research has three key components.

    First, I will create NVM 'persistency models': rigorous ways of describing, with mathematical precision, exactly what NVM can and cannot do. I will then use specialised tools to test Intel and ARM microchips (in our PCs and phones) against my models and see how they behave when using NVM. Verifying that real-world hardware behaves as expected is an important step towards safe and reliable NVM, as it provides a sound foundation to write software on top of.

    Second, I will extend modern programming languages to enable writing programs (software) that can control how data persists to NVM, which in turn makes it easier and safer to recover NVM data. Currently it is impossible to write such programs, because NVM is so new a concept that persistence control is not yet part of modern programming languages. I will extend these languages and provide example programs and tests. I will then prove that these extensions are correct, so that software companies can rely on them to build their future products.

    Finally, I will develop ways to test and verify that programs safely recover NVM data. Testing is an important part of hardware and software development, but testing NVM persistency is currently infeasible: the only known approach is to run thousands of tests, cutting the power at a different time in each, and forcing such frequent power losses is both impractical and inefficient. I will develop new ways to test NVM persistency, which is the final key step towards widespread NVM adoption.

    NVM could save untold amounts of data, money and time every year. Data loss is faced not only by people who use computers every day, but also by data centres and safety-critical technologies worldwide. NVM can make data loss a thing of the past, but it requires a rigorous, safe foundation to be built on, to avoid trading one kind of unpredictability for another. This research project will provide that foundation and unleash the potential of this new technology.
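    The 'delete then replace' hazard described above can be made concrete with a toy crash-testing sketch: enumerate every point at which power could be cut and check whether a simple recovery invariant still holds. The program, the invariant, and the assumption that stores persist strictly in order are all illustrative simplifications, not the persistency models the project itself will build.

```python
# A minimal sketch of exhaustive crash testing against a (highly simplified)
# persistency model. All names and the recovery invariant are hypothetical.

# Program: delete the old record, then write its replacement.
# Each step is (key, value); value None models deletion.
program = [("record", None), ("record", "new data")]

def crash_states(steps):
    """Enumerate the memory states reachable if power is cut after any prefix
    of steps, assuming stores persist in program order (no reordering)."""
    state = {"record": "old data"}
    yield dict(state)
    for key, value in steps:
        state[key] = value
        yield dict(state)

def recovers(state):
    """Recovery invariant: the record must hold some data after restart."""
    return state.get("record") is not None

for i, state in enumerate(crash_states(program)):
    verdict = "ok" if recovers(state) else "DATA LOST"
    print(f"crash after step {i}: {state} -> {verdict}")
# A crash between the delete and the replacement write violates the invariant,
# which is exactly the 'bad temporary state' problem described above.
```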

  • Funder: UK Research and Innovation
    Project Code: EP/P004172/1
    Funder Contribution: 326,972 GBP

    Recent work on System-Level Games provides a semantic framework for modelling low-level code interactions involving resources shared between a program and its environment. This project will apply the framework to derive compositional analysis techniques for software compilation and verification. Semantically, the project will produce a paradigmatic model for programs made of combinations of arbitrary low- and high-level code fragments. By directly examining the interaction traces of code, as produced by our model, we will extract trace-based analyses based on coinduction, as well as behavioural types governing game plays. Through trace analysis we will also produce syntax-independent compilation analyses of tamper-resistance and code linking. These techniques will be applied to the prototype language Verity and will deliver significant improvements to its existing GOS compiler, which is a particularly suitable test-bench because of its high degree of heterogeneity, both in terms of languages (functional vs. hardware) and platforms (CPU-based vs. FPGA).

  • Funder: UK Research and Innovation
    Project Code: MR/T043830/1
    Funder Contribution: 1,159,760 GBP

    In an economy in which the UK software industry added £139 billion of value to UK GDP in 2018 alone, this fellowship will transform the way developers write software. By offering a radically new way of composing and customising programs, I will empower software developers to build more flexible, maintainable, and robust software.

    Computers must interact with the real world. In computer programs, real-world effects are pervasive, e.g. concurrency (performing two computations at once), distribution (performing computation in different places), input, output, and probability (e.g. for machine learning). Effect handlers are a general programming feature that can be used to implement all of these effects. They were introduced by theoretical computer scientists as part of a mathematical model of effects; thanks in part to my efforts, they now show promise as a practical programming tool. Interest in effect handlers in industry is growing. For instance, Facebook's React Fiber, the core of the market-leading React user interface library for web applications, is directly inspired by effect handlers, and Uber's Pyro tool for probabilistic programming and statistical inference makes essential use of effect handlers. Preliminary results suggest effect handlers have the potential to support efficient implementations. Nevertheless, existing implementations are in their infancy, and research is required to make them scale, both in terms of ease of programming and in terms of performance.

    I will develop the theory and practice of Effect Handler Oriented Programming (EHOP) as a uniform foundation for the modular and efficient implementation of effects. I will develop both high-level (for humans) and low-level (for machines) effect handler designs and implementations. In collaboration with my project partners I will ensure that EHOP has direct impact through two key technologies.

    + Hack. I will add effect handlers to Hack, the high-level programming language in which the Facebook server is written. Hack currently provides ad hoc support for features such as concurrency and probabilistic programming, which I will replace by effect handlers, eliminating the need to maintain several ad hoc features and introducing fine-grained control over details such as scheduling strategies. This will allow more flexible, maintainable, and robust software to be written in Hack, ultimately improving the user experience of billions of Facebook users.

    + WebAssembly. I will design and implement an effect handler extension for WebAssembly, a portable low-level bytecode supported by the top four browser vendors and designed to supersede JavaScript as the target language for the web. Currently, languages such as JavaScript provide a collection of ad hoc, overlapping concurrency features; each of these is hard-wired and has to be maintained separately, yet all of them can be implemented with minimal effort using effect handlers. Rather than hard-wiring and maintaining several ad hoc features, compiler developers will be able to rely on a single implementation of effect handlers in WebAssembly. This will enable more flexible, maintainable, and reliable programming language implementations, ultimately improving the user experience of billions of web users.

    A "killer app" for effect handlers is concurrency and distribution. A central aspect of concurrency and distribution is communication, and for communication to be safe, secure, and reliable, all parties must comply with appropriate protocols. Session types are a nascent technology for enforcing protocol compliance. Unifying the two main threads of my research over the last half decade, I will extend the theory and practice of effect handlers to enable session-typed concurrency and distribution features to be defined as effect handlers. Ultimately, this will enable safe, secure, and reliable communication infrastructure for billions of end users.
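    To give a flavour of what 'handling an effect' means, the sketch below simulates a tiny effect-handler discipline in Python: a computation yields Get/Put requests, and a handler interprets them against an explicit state. The generator encoding and every name in it are illustrative assumptions; this is not the project's Hack or WebAssembly design.

```python
# A minimal sketch of effect handlers, simulated with Python generators.
# All names (Effect, Get, Put, run_state, counter) are hypothetical.

class Effect:
    """Base class for effect requests yielded by effectful computations."""

class Get(Effect):
    """Request the current state."""

class Put(Effect):
    """Request that the state be replaced by `value`."""
    def __init__(self, value):
        self.value = value

def counter():
    """An effectful computation: increments a counter twice via Get/Put."""
    n = yield Get()
    yield Put(n + 1)
    n = yield Get()
    yield Put(n + 1)
    return n + 1

def run_state(comp, state):
    """A handler interpreting Get/Put requests against an explicit state."""
    gen = comp()
    try:
        request = next(gen)
        while True:
            if isinstance(request, Get):
                request = gen.send(state)      # resume with the current state
            elif isinstance(request, Put):
                state = request.value          # update the handled state
                request = gen.send(None)
            else:
                raise TypeError("unhandled effect")
    except StopIteration as stop:
        return stop.value, state

print(run_state(counter, 0))  # -> (2, 2)
```

    Swapping in a different handler (say, one that logs every Put, or replays Gets from a recorded trace) changes how the same computation behaves without touching its code, which is the modularity the abstract describes.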

  • Funder: UK Research and Innovation
    Project Code: EP/W025493/1
    Funder Contribution: 839,659 GBP

    We address the timely topic of online gender-based violence (GBV): almost 1 in every 2 women and non-binary people (46%) reported experiencing online abuse since the beginning of COVID-19 (Glitch report, 2020). Our aim is to create 'equally safe' online spaces through prevention, intervention and support for online GBV, via the development of advanced Machine Learning algorithms. In contrast to previous research in this area, our team will include experts on GBV and online harassment to ensure that we frame the problem in the way that is most helpful to victims/survivors of GBV. In other words, we not only focus on *how* to automatically detect online abuse, but also re-think *what* it is we need to detect, how we can *support* the victims, and how to *prevent* online GBV through promoting digital citizenship (i.e. prevention and intervention aimed at perpetrators and bystanders). Our methodology is based on the Scottish Government's Equally Safe strategy and implements a Responsible Innovation approach in close collaboration with third-sector charities with a long-standing track record in this domain. Our proposal aims to create the following impacts:

    1. Create long-term technical solutions to support safer online spaces, including advanced abuse detection tools, tools to automatically generate counter-narratives aimed at perpetrators and bystanders, and a chatbot providing proactive support to victims/survivors.
    2. Promote 'equitable' algorithms which are able to reflect multiple perspectives/viewpoints and do not marginalise minority views.
    3. Increase digital literacy concerning the safe use of social media from an early age.
