How changes in synaptic connections lead to learning and memory is a central question in neuroscience. Previous modeling efforts have focused on biologically realistic learning rules and dynamics, but no known model has been shown to perform successful learning while preserving the experimentally observed lognormal distribution of synaptic weights. It thus remains unknown how constraining the initial and final distributions of synaptic weights affects the operational principles of the learning rule and the capacity of the network. Here we set up a spiking neural network with a "trading floor" of synaptic weights, where weights can be swapped between excitatory synapses according to either a functional or a structural plasticity rule. Swapping retains the weight distribution while remaining agnostic to the implementation of the learning rule. We then test the network for pattern completion of corrupted visual inputs as a measure of memory. We find that while both functional and structural rules lead to pattern completion, they differ in the minimum synaptic change necessary to store a pattern and in the resulting dynamics. Functional plasticity requires broad reconfiguration of weights, but is self-stabilizing and does not lead to runaway excitation. In contrast, structural plasticity of a small number of connections is sufficient for learning, yet results in aberrant network behavior. To explore the capacity of the network, we swap synapses in response to multiple stimuli of increasing size. This work thus ties together network topology, memory capacity, and the limitations of synaptic plasticity rules.
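To make the weight-swapping idea concrete, here is a minimal sketch in Python of why exchanging weights between existing excitatory synapses leaves the lognormal distribution intact. The weight matrix, connectivity level, and the pairwise `swap_weights` step are illustrative assumptions, not the paper's actual functional or structural plasticity rules.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical excitatory weight matrix: lognormally distributed values,
# with zeros marking absent connections (~10% connectivity assumed).
n = 200
w = rng.lognormal(mean=-0.7, sigma=1.0, size=(n, n))
w[rng.random((n, n)) > 0.1] = 0.0
np.fill_diagonal(w, 0.0)

def swap_weights(w, pre_a, post_a, pre_b, post_b):
    """Exchange the weights of two existing synapses.

    Because values are only moved, never created or destroyed, the
    multiset of weights -- and hence the lognormal shape of the
    distribution -- is unchanged by the operation.
    """
    w[post_a, pre_a], w[post_b, pre_b] = w[post_b, pre_b], w[post_a, pre_a]

# Toy "learning" step: a synapse flagged by some plasticity signal
# (here chosen at random) trades its weight with another synapse.
existing = np.argwhere(w > 0)            # rows are (post, pre) index pairs
before = np.sort(w[w > 0])
a, b = existing[rng.choice(len(existing), size=2, replace=False)]
swap_weights(w, a[1], a[0], b[1], b[0])
after = np.sort(w[w > 0])
assert np.allclose(before, after)        # distribution preserved exactly
```

In this sketch the swap is arbitrary; in the model described above, which synapses trade weights would instead be determined by the functional or structural plasticity rule driving pattern storage.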